By: Syeda Ambreen Karim Bokhari
A significant number of hotel bookings are canceled or end in no-shows. Typical reasons include changes of plan and scheduling conflicts, and canceling is often made easier by the option to do so free of charge or at a low cost. This is convenient for hotel guests but a less desirable, potentially revenue-diminishing factor for hotels, and the losses are particularly high for last-minute cancellations.
Online booking channels have also dramatically changed customers' booking possibilities and behavior. This adds a further dimension to the challenge of how hotels handle cancellations, which are no longer driven by traditional booking and guest characteristics alone.
Booking cancellations impact a hotel on various fronts, from lost revenue on rooms that cannot be resold to the extra cost and effort of redistributing them at short notice.
The increasing number of cancellations calls for a machine-learning-based solution that can predict which bookings are likely to be canceled. Star Hotels Group operates a chain of hotels in Portugal. They are facing a high number of booking cancellations and have reached out to your firm for data-driven solutions. As a data scientist, you have to analyze the provided data to find which factors have a high influence on booking cancellations, build a predictive model that can flag in advance which bookings are likely to be canceled, and help formulate profitable cancellation and refund policies.
The data contains the different attributes of customers' booking details. The detailed data dictionary is given below.
Data Dictionary
# this will help in making the Python code more structured automatically (good coding practice)
# %load_ext nb_black
import warnings
warnings.filterwarnings("ignore")
from statsmodels.tools.sm_exceptions import ConvergenceWarning
warnings.simplefilter("ignore", ConvergenceWarning)
# Libraries to help with reading and manipulating data
import pandas as pd
import numpy as np
# Library to split data
from sklearn.model_selection import train_test_split
# Libraries to help with data visualization
import matplotlib.pyplot as plt
import seaborn as sns
# Removes the limit for the number of displayed columns
pd.set_option("display.max_columns", None)
# Sets the limit for the number of displayed rows
pd.set_option("display.max_rows", 200)
# To build model for prediction
import statsmodels.stats.api as sms
from statsmodels.stats.outliers_influence import variance_inflation_factor
import statsmodels.api as sm
from statsmodels.tools.tools import add_constant
from sklearn.linear_model import LogisticRegression
# To get different metric scores
from sklearn.metrics import (
    f1_score,
    accuracy_score,
    recall_score,
    precision_score,
    confusion_matrix,
    roc_auc_score,
    ConfusionMatrixDisplay,  # plot_confusion_matrix was removed in scikit-learn 1.2
    precision_recall_curve,
    roc_curve,
    make_scorer,
)
# To tune different models
from sklearn.model_selection import GridSearchCV
data=pd.read_csv('StarHotelsGroup.csv')
df_copy=data.copy()
data.sample(5)
| | no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 6415 | 2 | 0 | 2 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 113 | 2019 | 3 | 10 | Online | 0 | 0 | 0 | 88.40 | 2 | Canceled |
| 6248 | 3 | 0 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 133 | 2018 | 7 | 23 | Online | 0 | 0 | 0 | 189.00 | 0 | Canceled |
| 5216 | 2 | 1 | 2 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 196 | 2019 | 8 | 19 | Online | 0 | 0 | 0 | 132.53 | 0 | Canceled |
| 28192 | 3 | 0 | 2 | 2 | Meal Plan 1 | 0 | Room_Type 4 | 36 | 2018 | 5 | 1 | Online | 0 | 0 | 0 | 159.30 | 3 | Not_Canceled |
| 4440 | 2 | 0 | 0 | 4 | Meal Plan 1 | 0 | Room_Type 1 | 314 | 2019 | 8 | 1 | Online | 0 | 0 | 0 | 109.10 | 1 | Canceled |
data.shape
(56926, 18)
data.columns
Index(['no_of_adults', 'no_of_children', 'no_of_weekend_nights',
'no_of_week_nights', 'type_of_meal_plan', 'required_car_parking_space',
'room_type_reserved', 'lead_time', 'arrival_year', 'arrival_month',
'arrival_date', 'market_segment_type', 'repeated_guest',
'no_of_previous_cancellations', 'no_of_previous_bookings_not_canceled',
'avg_price_per_room', 'no_of_special_requests', 'booking_status'],
dtype='object')
data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 56926 entries, 0 to 56925
Data columns (total 18 columns):
 #   Column                                Non-Null Count  Dtype
---  ------                                --------------  -----
 0   no_of_adults                          56926 non-null  int64
 1   no_of_children                        56926 non-null  int64
 2   no_of_weekend_nights                  56926 non-null  int64
 3   no_of_week_nights                     56926 non-null  int64
 4   type_of_meal_plan                     56926 non-null  object
 5   required_car_parking_space            56926 non-null  int64
 6   room_type_reserved                    56926 non-null  object
 7   lead_time                             56926 non-null  int64
 8   arrival_year                          56926 non-null  int64
 9   arrival_month                         56926 non-null  int64
 10  arrival_date                          56926 non-null  int64
 11  market_segment_type                   56926 non-null  object
 12  repeated_guest                        56926 non-null  int64
 13  no_of_previous_cancellations          56926 non-null  int64
 14  no_of_previous_bookings_not_canceled  56926 non-null  int64
 15  avg_price_per_room                    56926 non-null  float64
 16  no_of_special_requests                56926 non-null  int64
 17  booking_status                        56926 non-null  object
dtypes: float64(1), int64(13), object(4)
memory usage: 7.8+ MB
data.isnull().sum()
no_of_adults                            0
no_of_children                          0
no_of_weekend_nights                    0
no_of_week_nights                       0
type_of_meal_plan                       0
required_car_parking_space              0
room_type_reserved                      0
lead_time                               0
arrival_year                            0
arrival_month                           0
arrival_date                            0
market_segment_type                     0
repeated_guest                          0
no_of_previous_cancellations            0
no_of_previous_bookings_not_canceled    0
avg_price_per_room                      0
no_of_special_requests                  0
booking_status                          0
dtype: int64
data[data.duplicated()].count()
no_of_adults                            14350
no_of_children                          14350
no_of_weekend_nights                    14350
no_of_week_nights                       14350
type_of_meal_plan                       14350
required_car_parking_space              14350
room_type_reserved                      14350
lead_time                               14350
arrival_year                            14350
arrival_month                           14350
arrival_date                            14350
market_segment_type                     14350
repeated_guest                          14350
no_of_previous_cancellations            14350
no_of_previous_bookings_not_canceled    14350
avg_price_per_room                      14350
no_of_special_requests                  14350
booking_status                          14350
dtype: int64
data.drop_duplicates(inplace=True)
data.shape
(42576, 18)
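About a quarter of the rows were exact duplicates and are dropped above. As a minimal sketch of how `duplicated()` and `drop_duplicates()` interact (the `toy` frame below is invented for illustration; it is not part of the hotel data):

```python
import pandas as pd

# Toy frame with one exact duplicate row, mimicking the duplicate bookings above.
toy = pd.DataFrame({
    "lead_time": [10, 10, 55],
    "avg_price_per_room": [88.4, 88.4, 120.0],
})

# duplicated() flags every repeat after the first occurrence...
n_dupes = toy.duplicated().sum()

# ...and drop_duplicates() keeps that first occurrence by default (keep="first").
deduped = toy.drop_duplicates()
print(n_dupes, len(deduped))  # → 1 2
```

Note that `duplicated()` counts repeats, not duplicate groups, which is why the counts above (14350) equal exactly the difference between 56926 and 42576 rows.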
df=data.copy()
# Convert categorical variables to the category dtype
data['type_of_meal_plan'] = data.type_of_meal_plan.astype('category')
data['room_type_reserved'] = data.room_type_reserved.astype('category')
data['market_segment_type'] = data.market_segment_type.astype('category')
data['booking_status'] = data.booking_status.astype('category')
data.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 42576 entries, 0 to 56924
Data columns (total 18 columns):
 #   Column                                Non-Null Count  Dtype
---  ------                                --------------  -----
 0   no_of_adults                          42576 non-null  int64
 1   no_of_children                        42576 non-null  int64
 2   no_of_weekend_nights                  42576 non-null  int64
 3   no_of_week_nights                     42576 non-null  int64
 4   type_of_meal_plan                     42576 non-null  category
 5   required_car_parking_space            42576 non-null  int64
 6   room_type_reserved                    42576 non-null  category
 7   lead_time                             42576 non-null  int64
 8   arrival_year                          42576 non-null  int64
 9   arrival_month                         42576 non-null  int64
 10  arrival_date                          42576 non-null  int64
 11  market_segment_type                   42576 non-null  category
 12  repeated_guest                        42576 non-null  int64
 13  no_of_previous_cancellations          42576 non-null  int64
 14  no_of_previous_bookings_not_canceled  42576 non-null  int64
 15  avg_price_per_room                    42576 non-null  float64
 16  no_of_special_requests                42576 non-null  int64
 17  booking_status                        42576 non-null  category
dtypes: category(4), float64(1), int64(13)
memory usage: 5.0 MB
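Note the memory usage drop after the conversion (7.8+ MB to 5.0 MB, even after losing 14k rows). This is the expected effect of the category dtype: each distinct label is stored once, and the column itself holds only small integer codes. A minimal sketch on a toy column (invented for illustration, mimicking `market_segment_type`):

```python
import pandas as pd

# Toy low-cardinality string column repeated many times.
s_obj = pd.Series(["Online", "Offline", "Online", "Corporate"] * 1000)
s_cat = s_obj.astype("category")

# deep=True counts the actual string storage, not just pointer arrays.
obj_bytes = s_obj.memory_usage(deep=True)
cat_bytes = s_cat.memory_usage(deep=True)
print(obj_bytes, cat_bytes)  # category uses far fewer bytes
```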
data.describe().T
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| no_of_adults | 42576.0 | 1.916737 | 0.527524 | 0.0 | 2.0 | 2.0 | 2.0 | 4.0 |
| no_of_children | 42576.0 | 0.142146 | 0.459920 | 0.0 | 0.0 | 0.0 | 0.0 | 10.0 |
| no_of_weekend_nights | 42576.0 | 0.895270 | 0.887864 | 0.0 | 0.0 | 1.0 | 2.0 | 8.0 |
| no_of_week_nights | 42576.0 | 2.321167 | 1.519328 | 0.0 | 1.0 | 2.0 | 3.0 | 17.0 |
| required_car_parking_space | 42576.0 | 0.034362 | 0.182160 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
| lead_time | 42576.0 | 77.315953 | 77.279616 | 0.0 | 16.0 | 53.0 | 118.0 | 521.0 |
| arrival_year | 42576.0 | 2018.297891 | 0.626126 | 2017.0 | 2018.0 | 2018.0 | 2019.0 | 2019.0 |
| arrival_month | 42576.0 | 6.365488 | 3.051924 | 1.0 | 4.0 | 6.0 | 9.0 | 12.0 |
| arrival_date | 42576.0 | 15.682873 | 8.813991 | 1.0 | 8.0 | 16.0 | 23.0 | 31.0 |
| repeated_guest | 42576.0 | 0.030886 | 0.173011 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
| no_of_previous_cancellations | 42576.0 | 0.025413 | 0.358194 | 0.0 | 0.0 | 0.0 | 0.0 | 13.0 |
| no_of_previous_bookings_not_canceled | 42576.0 | 0.222731 | 2.242308 | 0.0 | 0.0 | 0.0 | 0.0 | 72.0 |
| avg_price_per_room | 42576.0 | 112.375800 | 40.865896 | 0.0 | 85.5 | 107.0 | 135.0 | 540.0 |
| no_of_special_requests | 42576.0 | 0.768109 | 0.837264 | 0.0 | 0.0 | 1.0 | 1.0 | 5.0 |
data.describe(include='category').T
| | count | unique | top | freq |
|---|---|---|---|---|
| type_of_meal_plan | 42576 | 4 | Meal Plan 1 | 31863 |
| room_type_reserved | 42576 | 7 | Room_Type 1 | 29730 |
| market_segment_type | 42576 | 5 | Online | 34169 |
| booking_status | 42576 | 2 | Not_Canceled | 28089 |
data.nunique()
no_of_adults                               5
no_of_children                             6
no_of_weekend_nights                       9
no_of_week_nights                         18
type_of_meal_plan                          4
required_car_parking_space                 2
room_type_reserved                         7
lead_time                                397
arrival_year                               3
arrival_month                             12
arrival_date                              31
market_segment_type                        5
repeated_guest                             2
no_of_previous_cancellations               9
no_of_previous_bookings_not_canceled      73
avg_price_per_room                      4939
no_of_special_requests                     6
booking_status                             2
dtype: int64
# filtering categorical columns
cat_columns = data.describe(include=["category"]).columns
cat_columns
Index(['type_of_meal_plan', 'room_type_reserved', 'market_segment_type',
'booking_status'],
dtype='object')
for i in cat_columns:
print("Unique values in", i, "are :")
print(data[i].value_counts())
print("*" * 50)
Unique values in type_of_meal_plan are :
Meal Plan 1     31863
Not Selected     8716
Meal Plan 2      1989
Meal Plan 3         8
Name: type_of_meal_plan, dtype: int64
**************************************************
Unique values in room_type_reserved are :
Room_Type 1    29730
Room_Type 4     9369
Room_Type 6     1540
Room_Type 5      906
Room_Type 2      718
Room_Type 7      307
Room_Type 3        6
Name: room_type_reserved, dtype: int64
**************************************************
Unique values in market_segment_type are :
Online           34169
Offline           5777
Corporate         1939
Complementary      496
Aviation           195
Name: market_segment_type, dtype: int64
**************************************************
Unique values in booking_status are :
Not_Canceled    28089
Canceled        14487
Name: booking_status, dtype: int64
**************************************************
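Some levels are extremely rare (Meal Plan 3 has 8 rows, Room_Type 3 has 6 out of 42,576), which can produce unstable coefficients once these columns are one-hot encoded. One common option, not part of the original notebook, is to collapse near-empty levels into an "Other" bucket before encoding. A minimal sketch on an invented toy column:

```python
import pandas as pd

# Toy meal-plan column with one very rare level (mimicking Meal Plan 3's 8 rows).
meals = pd.Series(["Meal Plan 1"] * 50 + ["Not Selected"] * 10 + ["Meal Plan 3"] * 1)

# Collapse any level below a minimum count into a catch-all "Other" bucket,
# so the model does not fit noise from near-empty categories.
counts = meals.value_counts()
rare = counts[counts < 5].index
collapsed = meals.where(~meals.isin(rare), "Other")
print(collapsed.value_counts())
```

The threshold (5 here) is a judgment call; whether to collapse at all depends on whether the rare level is believed to carry signal.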
columns1 = list(data)[0:-1]  # excluding the target column booking_status
# Histograms of the numeric columns
data[columns1].hist(stacked=False, bins=100, figsize=(12, 30), layout=(14, 2));
# Function to create barplots that indicate the percentage for each category.
def bar_perc(plot, feature):
    '''
    plot: axes object returned by a seaborn count/bar plot
    feature: 1-d categorical feature array
    '''
    total = len(feature)  # length of the column
    for p in plot.patches:
        percentage = '{:.1f}%'.format(100 * p.get_height() / total)  # percentage of each class of the category
        x = p.get_x() + p.get_width() / 2 - 0.05  # x position of the label
        y = p.get_y() + p.get_height()  # height of the bar
        plot.annotate(percentage, (x, y), size=12)  # annotate the percentage
def dist_box_violin(data):
    # Plots a combined graph for univariate analysis of a continuous variable,
    # to check spread, central tendency, dispersion, and outliers
    Name = data.name.upper()
    fig, axes = plt.subplots(1, 3, figsize=(17, 7))
    fig.suptitle("SPREAD OF DATA FOR " + Name, fontsize=18, fontweight='bold')
    # sns.distplot is deprecated; histplot is its replacement
    sns.histplot(data, kde=False, color='Blue', ax=axes[0])
    axes[0].axvline(data.mean(), color='y', linestyle='--', linewidth=2)
    axes[0].axvline(data.median(), color='r', linestyle='dashed', linewidth=2)
    axes[0].axvline(data.mode()[0], color='g', linestyle='solid', linewidth=2)
    axes[0].legend(['Mean', 'Median', 'Mode'])
    sns.boxplot(x=data, showmeans=True, orient='h', color="purple", ax=axes[1])
    # violin plot combines a boxplot with a kernel density estimate
    sns.violinplot(x=data, ax=axes[2])
# function to create labeled barplots
def labeled_barplot(data, feature, perc=False, n=None):
"""
Barplot with percentage at the top
data: dataframe
feature: dataframe column
perc: whether to display percentages instead of count (default is False)
n: displays the top n category levels (default is None, i.e., display all levels)
"""
total = len(data[feature]) # length of the column
count = data[feature].nunique()
if n is None:
plt.figure(figsize=(count + 1, 5))
else:
plt.figure(figsize=(n + 1, 5))
plt.xticks(rotation=90, fontsize=15)
ax = sns.countplot(
data=data,
x=feature,
palette="winter",
order=data[feature].value_counts().index[:n].sort_values(),
)
for p in ax.patches:
if perc == True:
label = "{:.1f}%".format(
100 * p.get_height() / total
) # percentage of each class of the category
else:
label = p.get_height() # count of each level of the category
x = p.get_x() + p.get_width() / 2 # width of the plot
y = p.get_height() # height of the plot
ax.annotate(
label,
(x, y),
ha="center",
va="center",
size=12,
xytext=(0, 5),
textcoords="offset points",
) # annotate the percentage
plt.show() # show the plot
col_list1=['no_of_adults',
'no_of_children',
'no_of_weekend_nights',
'no_of_week_nights',
'required_car_parking_space',
'lead_time',
'arrival_year',
'arrival_month',
'arrival_date',
'repeated_guest',
'no_of_previous_cancellations',
'no_of_previous_bookings_not_canceled',
'avg_price_per_room',
'no_of_special_requests']
for i in col_list1:
dist_box_violin(data[i])
cat_columns
Index(['type_of_meal_plan', 'room_type_reserved', 'market_segment_type',
'booking_status'],
dtype='object')
for col in cat_columns:
    labeled_barplot(data, col, perc=True)
# function to plot stacked bar chart
def stacked_barplot(data, predictor, target):
"""
Print the category counts and plot a stacked bar chart
data: dataframe
predictor: independent variable
target: target variable
"""
count = data[predictor].nunique()
sorter = data[target].value_counts().index[-1]
tab1 = pd.crosstab(data[predictor], data[target], margins=True).sort_values(
by=sorter, ascending=False
)
print(tab1)
print("-" * 120)
tab = pd.crosstab(data[predictor], data[target], normalize="index").sort_values(
by=sorter, ascending=False
)
tab.plot(kind="bar", stacked=True, figsize=(count + 5, 6))
    plt.legend(loc="upper left", bbox_to_anchor=(1, 1))
plt.show()
### function to plot distributions wrt target
def distribution_plot_wrt_target(data, predictor, target):
fig, axs = plt.subplots(2, 2, figsize=(12, 10))
target_uniq = data[target].unique()
axs[0, 0].set_title("Distribution of target for target=" + str(target_uniq[0]))
sns.histplot(
data=data[data[target] == target_uniq[0]],
x=predictor,
kde=True,
ax=axs[0, 0],
color="teal",
stat="density",
)
axs[0, 1].set_title("Distribution of target for target=" + str(target_uniq[1]))
sns.histplot(
data=data[data[target] == target_uniq[1]],
x=predictor,
kde=True,
ax=axs[0, 1],
color="orange",
stat="density",
)
axs[1, 0].set_title("Boxplot w.r.t target")
sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow")
axs[1, 1].set_title("Boxplot (without outliers) w.r.t target")
sns.boxplot(
data=data,
x=target,
y=predictor,
ax=axs[1, 1],
showfliers=False,
palette="gist_rainbow",
)
plt.tight_layout()
plt.show()
distribution_plot_wrt_target(data, "lead_time", "booking_status")
distribution_plot_wrt_target(data, "avg_price_per_room", "booking_status")
Questions:
## Q1. What are the busiest months in the hotel?
data['arrival_month'].value_counts().sort_values(ascending=False)
8     5312
7     4725
5     4348
4     4227
6     4073
3     4044
10    3209
9     3057
2     2889
12    2385
11    2192
1     2115
Name: arrival_month, dtype: int64
# Count plot and pie chart for arrival_month
colors_list = ['#3366cc','#651593','#a03b87','#e4a859','#da8266',"#FAAE7B", '#ffcc00','#ffff66']
f,ax=plt.subplots(1,2,figsize=(15,5))
data['arrival_month'].value_counts().plot.pie(autopct='%1.1f%%',ax=ax[0],startangle=90,colors = colors_list,pctdistance=1.1, labeldistance=1.3)
ax[0].set_title('Arrival month')
ax[0].set_ylabel('')
sns.countplot(x='arrival_month', data=data, ax=ax[1], palette=['#651593','#3366cc','#a03b87','#da8266','#ffcc00',"#FAAE7B",'#e4a859','#ffff66'])
ax[1].set_title('Arrival month')
plt.show()
labeled_barplot(data, "arrival_month")
stacked_barplot(data, "arrival_month", "booking_status")
booking_status  Canceled  Not_Canceled    All
arrival_month
All                14487         28089  42576
8                   2475          2837   5312
7                   2240          2485   4725
5                   1674          2674   4348
4                   1627          2600   4227
6                   1584          2489   4073
3                   1195          2849   4044
10                   918          2291   3209
9                    888          2169   3057
2                    796          2093   2889
11                   496          1696   2192
12                   340          2045   2385
1                    254          1861   2115
------------------------------------------------------------------------------------------------------------------------
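The stacked chart above is driven by `pd.crosstab(..., normalize="index")`, which converts each month's raw counts into proportions, i.e. the cancellation rate per month rather than the volume. A minimal sketch on invented toy counts (the technique, not the hotel numbers):

```python
import pandas as pd

# Toy bookings table; the counts are invented, the technique mirrors the
# month-by-month crosstab above.
toy = pd.DataFrame({
    "arrival_month": [1, 1, 1, 8, 8, 8, 8],
    "booking_status": ["Canceled", "Not_Canceled", "Not_Canceled",
                       "Canceled", "Canceled", "Not_Canceled", "Not_Canceled"],
})

# normalize="index" turns each row of the crosstab into proportions, so the
# busiest month and the most cancellation-prone month can be compared fairly.
rates = pd.crosstab(toy["arrival_month"], toy["booking_status"], normalize="index")
print(rates)  # month 1: 1/3 canceled; month 8: 1/2 canceled
```

On the real table above, the same normalization shows that the busiest months (July, August) also have the highest cancellation rates, while December and January cancel least.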
## Q2. Which market segment do most of the guests come from?
# Frequency table for Product
market= pd.crosstab(index=data['market_segment_type'], columns='count')
market
| col_0 | count |
|---|---|
| market_segment_type | |
| Aviation | 195 |
| Complementary | 496 |
| Corporate | 1939 |
| Offline | 5777 |
| Online | 34169 |
labeled_barplot(data, "market_segment_type", perc=True)
# Count plot and pie chart for market_segment_type
colors_list = ['#3366cc','#651593','#a03b87','#e4a859','#da8266',"#FAAE7B", '#ffcc00','#ffff66']
f,ax=plt.subplots(1,2,figsize=(15,5))
data['market_segment_type'].value_counts().plot.pie(autopct='%1.1f%%',ax=ax[0],startangle=90,colors = colors_list,pctdistance=1.1, labeldistance=1.3)
ax[0].set_title('Market segment')
ax[0].set_ylabel('')
sns.countplot(x='market_segment_type', data=data, ax=ax[1], palette=['#651593','#3366cc','#a03b87','#da8266','#ffcc00',"#FAAE7B",'#e4a859','#ffff66'])
ax[1].set_title('Market segment')
plt.show()
stacked_barplot(data, "market_segment_type", "booking_status")
booking_status       Canceled  Not_Canceled    All
market_segment_type
All                     14487         28089  42576
Online                  13483         20686  34169
Offline                   804          4973   5777
Corporate                 167          1772   1939
Aviation                   33           162    195
Complementary               0           496    496
------------------------------------------------------------------------------------------------------------------------
## Q3. Hotel rates are dynamic and change according to demand and customer demographics. What are the differences in room prices in different market segments?
# A crosstab of market_segment_type against every distinct room price is
# unreadable (avg_price_per_room has 4,939 unique values); per-segment summary
# statistics answer the question directly.
data.groupby('market_segment_type')['avg_price_per_room'].describe()
96.27 | 96.30 | 96.32 | 96.33 | 96.36 | 96.39 | 96.40 | 96.41 | 96.48 | 96.50 | 96.53 | 96.55 | 96.56 | 96.60 | 96.62 | 96.67 | 96.73 | 96.75 | 96.76 | 96.78 | 96.79 | 96.80 | 96.84 | 96.88 | 96.90 | 96.91 | 96.92 | 96.94 | 96.98 | 96.99 | 97.00 | 97.01 | 97.02 | 97.07 | 97.10 | 97.12 | 97.16 | 97.18 | 97.20 | 97.23 | 97.28 | 97.30 | 97.32 | 97.33 | 97.35 | 97.39 | 97.40 | 97.41 | 97.42 | 97.43 | 97.47 | 97.50 | 97.54 | 97.55 | 97.56 | 97.58 | 97.59 | 97.60 | 97.63 | 97.65 | 97.67 | 97.70 | 97.71 | 97.75 | 97.77 | 97.79 | 97.82 | 97.85 | 97.87 | 97.88 | 97.92 | 97.93 | 97.96 | 97.98 | 98.00 | 98.03 | 98.10 | 98.11 | 98.12 | 98.13 | 98.17 | 98.18 | 98.20 | 98.21 | 98.24 | 98.25 | 98.26 | 98.28 | 98.30 | 98.32 | 98.33 | 98.36 | 98.39 | 98.40 | 98.41 | 98.44 | 98.49 | 98.50 | 98.55 | 98.56 | 98.57 | 98.60 | 98.62 | 98.64 | 98.67 | 98.70 | 98.76 | 98.80 | 98.82 | 98.83 | 98.85 | 98.87 | 98.90 | 98.93 | 98.94 | 98.98 | 99.00 | 99.03 | 99.07 | 99.10 | 99.13 | 99.14 | 99.15 | 99.17 | 99.18 | 99.20 | 99.21 | 99.22 | 99.24 | 99.28 | 99.30 | 99.33 | 99.36 | 99.40 | 99.41 | 99.45 | 99.46 | 99.48 | 99.49 | 99.50 | 99.51 | 99.57 | 99.60 | 99.63 | 99.66 | 99.67 | 99.68 | 99.71 | 99.72 | 99.73 | 99.75 | 99.78 | 99.79 | 99.85 | 99.86 | 99.87 | 99.88 | 99.90 | 99.95 | 99.96 | 100.00 | 100.02 | 100.05 | 100.08 | 100.09 | 100.10 | 100.12 | 100.13 | 100.16 | 100.20 | 100.25 | 100.28 | 100.29 | 100.30 | 100.33 | 100.35 | 100.37 | 100.38 | 100.40 | 100.44 | 100.47 | 100.48 | 100.50 | 100.51 | 100.54 | 100.58 | 100.60 | 100.62 | 100.63 | 100.64 | 100.66 | 100.67 | 100.70 | 100.72 | 100.73 | 100.74 | 100.75 | 100.80 | 100.81 | 100.84 | 100.87 | 100.93 | 100.94 | 100.95 | 100.97 | 101.00 | 101.03 | 101.05 | 101.07 | 101.08 | 101.09 | 101.10 | 101.13 | 101.15 | 101.18 | 101.20 | 101.24 | 101.25 | 101.27 | 101.31 | 101.33 | 101.34 | 101.35 | 101.36 | 101.38 | 101.40 | 101.41 | 101.43 | 101.45 | 101.46 | 101.49 | 101.50 | 101.51 | 101.52 | 101.53 | 101.54 | 101.55 | 101.58 | 101.65 | 101.66 | 
101.68 | 101.70 | 101.72 | 101.73 | 101.75 | 101.77 | 101.78 | 101.79 | 101.80 | 101.83 | 101.86 | 101.87 | 101.88 | 101.98 | 101.99 | 102.00 | 102.04 | 102.05 | 102.06 | 102.08 | 102.09 | 102.10 | 102.15 | 102.20 | 102.25 | 102.28 | 102.30 | 102.33 | 102.34 | 102.35 | 102.37 | 102.38 | 102.39 | 102.40 | 102.41 | 102.43 | 102.47 | 102.50 | 102.51 | 102.56 | 102.58 | 102.60 | 102.68 | 102.70 | 102.72 | 102.73 | 102.74 | 102.75 | 102.76 | 102.79 | 102.80 | 102.83 | 102.85 | 102.86 | 102.88 | 102.90 | 102.97 | 102.99 | 103.00 | 103.05 | 103.06 | 103.09 | 103.10 | 103.13 | 103.14 | 103.15 | 103.17 | 103.18 | 103.19 | 103.20 | 103.21 | 103.22 | 103.28 | 103.29 | 103.32 | 103.35 | 103.36 | 103.39 | 103.40 | 103.41 | 103.42 | 103.44 | 103.46 | 103.50 | 103.52 | 103.53 | 103.55 | 103.56 | 103.57 | 103.60 | 103.61 | 103.62 | 103.64 | 103.67 | 103.68 | 103.69 | 103.70 | 103.71 | 103.72 | 103.75 | 103.76 | 103.78 | 103.80 | 103.81 | 103.84 | 103.86 | 103.87 | 103.89 | 103.95 | 103.97 | 104.00 | 104.04 | 104.05 | 104.08 | 104.10 | 104.11 | 104.13 | 104.14 | 104.17 | 104.18 | 104.20 | 104.21 | 104.22 | 104.25 | 104.30 | 104.31 | 104.32 | 104.39 | 104.40 | 104.42 | 104.43 | 104.48 | 104.49 | 104.50 | 104.53 | 104.55 | 104.58 | 104.60 | 104.61 | 104.62 | 104.63 | 104.64 | 104.65 | 104.67 | 104.70 | 104.72 | 104.73 | 104.76 | 104.77 | 104.79 | 104.80 | 104.85 | 104.88 | 104.92 | 104.96 | 104.98 | 104.99 | 105.00 | 105.04 | 105.08 | 105.10 | 105.11 | 105.12 | 105.21 | 105.25 | 105.26 | 105.28 | 105.30 | 105.33 | 105.34 | 105.37 | 105.40 | 105.48 | 105.49 | 105.50 | 105.52 | 105.60 | 105.61 | 105.62 | 105.64 | 105.65 | 105.67 | 105.68 | 105.70 | 105.74 | 105.75 | 105.80 | 105.82 | 105.83 | 105.84 | 105.90 | 105.91 | 105.93 | 105.95 | 105.96 | 106.00 | 106.02 | 106.03 | 106.07 | 106.08 | 106.10 | 106.13 | 106.14 | 106.16 | 106.20 | 106.21 | 106.24 | 106.25 | 106.26 | 106.28 | 106.30 | 106.32 | 106.33 | 106.34 | 106.35 | 106.36 | 106.38 | 106.40 | 106.43 | 106.48 | 106.50 | 106.53 | 
106.54 | 106.59 | 106.60 | 106.65 | 106.67 | 106.68 | 106.70 | 106.71 | 106.72 | 106.74 | 106.75 | 106.77 | 106.80 | 106.83 | 106.84 | 106.88 | 106.89 | 106.90 | 106.92 | 106.93 | 106.95 | 107.00 | 107.03 | 107.05 | 107.09 | 107.10 | 107.19 | 107.20 | 107.25 | 107.29 | 107.30 | 107.33 | 107.38 | 107.41 | 107.42 | 107.44 | 107.45 | 107.46 | 107.50 | 107.53 | 107.55 | 107.57 | 107.58 | 107.59 | 107.61 | 107.64 | 107.65 | 107.67 | 107.69 | 107.70 | 107.73 | 107.75 | 107.78 | 107.80 | 107.82 | 107.87 | 107.88 | 107.92 | 107.95 | 107.99 | 108.00 | 108.04 | 108.05 | 108.07 | 108.08 | 108.10 | 108.11 | 108.12 | 108.16 | 108.17 | 108.19 | 108.20 | 108.23 | 108.26 | 108.29 | 108.30 | 108.33 | 108.34 | 108.35 | 108.38 | 108.40 | 108.42 | 108.45 | 108.49 | 108.50 | 108.51 | 108.53 | 108.57 | 108.58 | 108.59 | 108.60 | 108.65 | 108.66 | 108.67 | 108.68 | 108.72 | 108.75 | 108.79 | 108.80 | 108.88 | 108.90 | 108.95 | 108.99 | 109.00 | 109.08 | 109.10 | 109.16 | 109.18 | 109.20 | 109.23 | 109.25 | 109.28 | 109.29 | 109.30 | 109.31 | 109.33 | 109.34 | 109.35 | 109.37 | 109.39 | 109.40 | 109.44 | 109.45 | 109.49 | 109.50 | 109.55 | 109.58 | 109.60 | 109.61 | 109.65 | 109.66 | 109.67 | 109.68 | 109.70 | 109.71 | 109.73 | 109.75 | 109.77 | 109.80 | 109.81 | 109.82 | 109.83 | 109.85 | 109.86 | 109.87 | 109.90 | 109.93 | 109.95 | 109.98 | 110.00 | 110.08 | 110.09 | 110.10 | 110.11 | 110.12 | 110.13 | 110.15 | 110.16 | 110.17 | 110.18 | 110.19 | 110.20 | 110.21 | 110.22 | 110.24 | 110.25 | 110.30 | 110.31 | 110.33 | 110.34 | 110.36 | 110.37 | 110.40 | 110.43 | 110.45 | 110.46 | 110.50 | 110.53 | 110.57 | 110.58 | 110.59 | 110.60 | 110.65 | 110.67 | 110.68 | 110.70 | 110.71 | 110.75 | 110.77 | 110.78 | 110.80 | 110.84 | 110.88 | 110.90 | 110.93 | 110.94 | 110.97 | 110.99 | 111.00 | 111.06 | 111.11 | 111.14 | 111.15 | 111.19 | 111.20 | 111.25 | 111.30 | 111.31 | 111.33 | 111.34 | 111.35 | 111.40 | 111.42 | 111.50 | 111.52 | 111.59 | 111.60 | 111.65 | 111.67 | 111.68 | 111.69 | 111.70 | 
111.71 | 111.72 | 111.75 | 111.76 | 111.78 | 111.80 | 111.83 | 111.85 | 111.88 | 111.90 | 111.92 | 111.96 | 112.00 | 112.01 | 112.03 | 112.05 | 112.10 | 112.14 | 112.20 | 112.24 | 112.25 | 112.28 | 112.32 | 112.33 | 112.36 | 112.42 | 112.45 | 112.48 | 112.50 | 112.56 | 112.57 | 112.58 | 112.59 | 112.60 | 112.63 | 112.65 | 112.67 | 112.68 | 112.69 | 112.70 | 112.72 | 112.75 | 112.76 | 112.79 | 112.80 | 112.81 | 112.85 | 112.86 | 112.87 | 112.88 | 112.90 | 112.91 | 112.92 | 112.95 | 112.96 | 113.00 | 113.04 | 113.05 | 113.09 | 113.10 | 113.14 | 113.20 | 113.23 | 113.24 | 113.25 | 113.26 | 113.31 | 113.32 | 113.33 | 113.34 | 113.39 | 113.40 | 113.41 | 113.46 | 113.50 | 113.52 | 113.53 | 113.55 | 113.56 | 113.57 | 113.58 | 113.60 | 113.62 | 113.64 | 113.65 | 113.67 | 113.69 | 113.70 | 113.72 | 113.76 | 113.78 | 113.79 | 113.80 | 113.85 | 113.87 | 113.90 | 113.96 | 113.98 | 114.00 | 114.04 | 114.07 | 114.08 | 114.09 | 114.10 | 114.25 | 114.28 | 114.29 | 114.30 | 114.33 | 114.34 | 114.36 | 114.37 | 114.39 | 114.40 | 114.41 | 114.43 | 114.48 | 114.50 | 114.53 | 114.55 | 114.56 | 114.57 | 114.58 | 114.63 | 114.66 | 114.67 | 114.69 | 114.73 | 114.75 | 114.80 | 114.83 | 114.84 | 114.88 | 114.90 | 114.92 | 114.96 | 114.98 | 115.00 | 115.02 | 115.03 | 115.04 | 115.09 | 115.10 | 115.11 | 115.12 | 115.14 | 115.19 | 115.20 | 115.21 | 115.23 | 115.25 | 115.32 | 115.37 | 115.40 | 115.42 | 115.50 | 115.52 | 115.55 | 115.56 | 115.58 | 115.60 | 115.61 | 115.63 | 115.65 | 115.67 | 115.68 | 115.74 | 115.76 | 115.77 | 115.80 | 115.85 | 115.89 | 115.90 | 115.94 | 116.00 | 116.03 | 116.04 | 116.05 | 116.09 | 116.10 | 116.11 | 116.15 | 116.17 | 116.20 | 116.23 | 116.25 | 116.26 | 116.27 | 116.28 | 116.30 | 116.31 | 116.32 | 116.33 | 116.36 | 116.38 | 116.40 | 116.42 | 116.45 | 116.49 | 116.50 | 116.55 | 116.56 | 116.59 | 116.62 | 116.64 | 116.65 | 116.67 | 116.70 | 116.71 | 116.72 | 116.73 | 116.75 | 116.77 | 116.78 | 116.79 | 116.80 | 116.82 | 116.85 | 116.88 | 116.93 | 116.94 | 116.95 | 
116.96 | 116.99 | 117.00 | 117.04 | 117.05 | 117.06 | 117.10 | 117.11 | 117.12 | 117.15 | 117.18 | 117.20 | 117.28 | 117.29 | 117.30 | 117.33 | 117.36 | 117.37 | 117.41 | 117.42 | 117.45 | 117.50 | 117.51 | 117.53 | 117.55 | 117.60 | 117.64 | 117.65 | 117.66 | 117.67 | 117.68 | 117.72 | 117.73 | 117.74 | 117.75 | 117.79 | 117.80 | 117.81 | 117.82 | 117.87 | 117.88 | 117.90 | 117.91 | 117.98 | 118.00 | 118.03 | 118.06 | 118.07 | 118.08 | 118.09 | 118.10 | 118.14 | 118.15 | 118.16 | 118.19 | 118.20 | 118.25 | 118.27 | 118.29 | 118.30 | 118.32 | 118.33 | 118.35 | 118.43 | 118.46 | 118.50 | 118.51 | 118.58 | 118.60 | 118.62 | 118.65 | 118.66 | 118.67 | 118.75 | 118.76 | 118.80 | 118.86 | 118.88 | 118.90 | 118.95 | 118.96 | 118.98 | 118.99 | 119.00 | 119.03 | 119.07 | 119.10 | 119.12 | 119.16 | 119.17 | 119.20 | 119.24 | 119.25 | 119.28 | 119.33 | 119.35 | 119.36 | 119.38 | 119.39 | 119.40 | 119.44 | 119.50 | 119.51 | 119.52 | 119.57 | 119.58 | 119.60 | 119.67 | 119.68 | 119.69 | 119.70 | 119.71 | 119.72 | 119.75 | 119.77 | 119.80 | 119.81 | 119.82 | 119.83 | 119.85 | 119.87 | 119.90 | 119.97 | 120.00 | 120.02 | 120.06 | 120.08 | 120.10 | 120.12 | 120.15 | 120.18 | 120.24 | 120.27 | 120.28 | 120.30 | 120.31 | 120.33 | 120.34 | 120.35 | 120.36 | 120.38 | 120.40 | 120.42 | 120.47 | 120.49 | 120.50 | 120.51 | 120.52 | 120.53 | 120.55 | 120.58 | 120.60 | 120.65 | 120.67 | 120.69 | 120.70 | 120.75 | 120.78 | 120.80 | 120.82 | 120.83 | 120.86 | 120.88 | 120.90 | 120.91 | 120.95 | 120.96 | 121.00 | 121.01 | 121.03 | 121.05 | 121.10 | 121.13 | 121.14 | 121.20 | 121.21 | 121.25 | 121.27 | 121.28 | 121.33 | 121.35 | 121.37 | 121.43 | 121.47 | 121.48 | 121.50 | 121.55 | 121.56 | 121.59 | 121.60 | 121.61 | 121.64 | 121.66 | 121.67 | 121.68 | 121.72 | 121.73 | 121.75 | 121.77 | 121.78 | 121.80 | 121.83 | 121.88 | 121.90 | 121.95 | 121.98 | 122.00 | 122.04 | 122.08 | 122.10 | 122.13 | 122.16 | 122.19 | 122.20 | 122.22 | 122.24 | 122.25 | 122.30 | 122.31 | 122.33 | 122.40 | 122.45 | 
122.48 | 122.50 | 122.55 | 122.60 | 122.63 | 122.65 | 122.66 | 122.67 | 122.70 | 122.72 | 122.74 | 122.75 | 122.76 | 122.80 | 122.83 | 122.84 | 122.85 | 122.88 | 122.91 | 122.94 | 122.96 | 123.00 | 123.04 | 123.05 | 123.08 | 123.11 | 123.12 | 123.13 | 123.17 | 123.20 | 123.23 | 123.25 | 123.30 | 123.33 | 123.40 | 123.43 | 123.45 | 123.46 | 123.50 | 123.53 | 123.55 | 123.60 | 123.62 | 123.66 | 123.67 | 123.68 | 123.69 | 123.70 | 123.75 | 123.80 | 123.81 | 123.82 | 123.83 | 123.84 | 123.90 | 123.93 | 123.96 | 123.97 | 123.98 | 124.00 | 124.04 | 124.10 | 124.13 | 124.16 | 124.20 | 124.25 | 124.30 | 124.33 | 124.35 | 124.36 | 124.39 | 124.40 | 124.44 | 124.46 | 124.50 | 124.52 | 124.53 | 124.59 | 124.60 | 124.63 | 124.65 | 124.67 | 124.70 | 124.71 | 124.74 | 124.79 | 124.80 | 124.90 | 124.95 | 125.00 | 125.02 | 125.08 | 125.09 | 125.10 | 125.12 | 125.16 | 125.17 | 125.20 | 125.25 | 125.27 | 125.28 | 125.33 | 125.38 | 125.40 | 125.45 | 125.46 | 125.50 | 125.51 | 125.52 | 125.53 | 125.55 | 125.60 | 125.62 | 125.63 | 125.66 | 125.67 | 125.70 | 125.71 | 125.77 | 125.80 | 125.83 | 125.85 | 125.99 | 126.00 | 126.01 | 126.04 | 126.10 | 126.12 | 126.13 | 126.14 | 126.19 | 126.23 | 126.25 | 126.26 | 126.28 | 126.30 | 126.31 | 126.33 | 126.36 | 126.40 | 126.41 | 126.44 | 126.45 | 126.48 | 126.50 | 126.51 | 126.54 | 126.60 | 126.63 | 126.64 | 126.65 | 126.67 | 126.69 | 126.72 | 126.73 | 126.75 | 126.77 | 126.80 | 126.86 | 126.90 | 127.00 | 127.01 | 127.03 | 127.05 | 127.10 | 127.14 | 127.15 | 127.16 | 127.20 | 127.22 | 127.25 | 127.26 | 127.33 | 127.37 | 127.38 | 127.39 | 127.40 | 127.44 | 127.45 | 127.47 | 127.48 | 127.50 | 127.58 | 127.60 | 127.62 | 127.65 | 127.67 | 127.71 | 127.72 | 127.79 | 127.80 | 127.82 | 127.83 | 127.92 | 127.93 | 127.94 | 127.97 | 127.98 | 128.00 | 128.01 | 128.06 | 128.10 | 128.16 | 128.20 | 128.21 | 128.25 | 128.28 | 128.29 | 128.33 | 128.34 | 128.35 | 128.38 | 128.40 | 128.44 | 128.47 | 128.49 | 128.50 | 128.52 | 128.56 | 128.57 | 128.59 | 128.60 | 
128.67 | 128.69 | 128.70 | 128.71 | 128.74 | 128.76 | 128.78 | 128.79 | 128.80 | 128.83 | 128.84 | 128.85 | 128.86 | 128.88 | 128.92 | 128.99 | 129.00 | 129.02 | 129.09 | 129.10 | 129.15 | 129.18 | 129.20 | 129.25 | 129.30 | 129.33 | 129.34 | 129.40 | 129.50 | 129.51 | 129.54 | 129.55 | 129.56 | 129.58 | 129.59 | 129.60 | 129.63 | 129.65 | 129.67 | 129.70 | 129.73 | 129.75 | 129.80 | 129.81 | 129.83 | 129.86 | 129.87 | 129.88 | 129.90 | 129.95 | 129.96 | 129.99 | 130.00 | 130.05 | 130.10 | 130.13 | 130.14 | 130.15 | 130.17 | 130.20 | 130.22 | 130.28 | 130.32 | 130.33 | 130.40 | 130.42 | 130.48 | 130.50 | 130.55 | 130.56 | 130.58 | 130.60 | 130.61 | 130.66 | 130.67 | 130.68 | 130.75 | 130.78 | 130.80 | 130.86 | 130.90 | 130.92 | 130.94 | 130.95 | 130.99 | 131.00 | 131.01 | 131.03 | 131.08 | 131.13 | 131.14 | 131.18 | 131.20 | 131.26 | 131.31 | 131.33 | 131.35 | 131.36 | 131.40 | 131.43 | 131.44 | 131.47 | 131.48 | 131.50 | 131.51 | 131.52 | 131.57 | 131.58 | 131.60 | 131.67 | 131.70 | 131.72 | 131.75 | 131.78 | 131.80 | 131.88 | 131.89 | 131.93 | 131.97 | 131.98 | 132.00 | 132.05 | 132.08 | 132.11 | 132.13 | 132.18 | 132.24 | 132.25 | 132.30 | 132.33 | 132.39 | 132.43 | 132.44 | 132.45 | 132.48 | 132.50 | 132.53 | 132.55 | 132.60 | 132.67 | 132.70 | 132.72 | 132.75 | 132.77 | 132.80 | 132.84 | 132.88 | 132.89 | 132.90 | 132.92 | 132.96 | 133.00 | 133.03 | 133.06 | 133.07 | 133.10 | 133.11 | 133.17 | 133.20 | 133.29 | 133.33 | 133.38 | 133.40 | 133.41 | 133.44 | 133.45 | 133.46 | 133.47 | 133.48 | 133.50 | 133.55 | 133.58 | 133.60 | 133.65 | 133.67 | 133.69 | 133.71 | 133.73 | 133.74 | 133.75 | 133.80 | 133.83 | 133.88 | 133.95 | 133.98 | 134.00 | 134.06 | 134.10 | 134.13 | 134.16 | 134.17 | 134.18 | 134.24 | 134.25 | 134.30 | 134.31 | 134.33 | 134.36 | 134.37 | 134.40 | 134.42 | 134.45 | 134.46 | 134.47 | 134.50 | 134.52 | 134.53 | 134.55 | 134.57 | 134.58 | 134.62 | 134.67 | 134.70 | 134.73 | 134.75 | 134.78 | 134.80 | 134.83 | 134.85 | 134.93 | 134.98 | 135.00 | 
135.01 | 135.10 | 135.13 | 135.15 | 135.20 | 135.22 | 135.27 | 135.30 | 135.31 | 135.33 | 135.39 | 135.40 | 135.43 | 135.45 | 135.50 | 135.51 | 135.52 | 135.58 | 135.60 | 135.64 | 135.67 | 135.72 | 135.75 | 135.76 | 135.80 | 135.87 | 135.89 | 135.90 | 135.99 | 136.00 | 136.02 | 136.06 | 136.08 | 136.10 | 136.12 | 136.13 | 136.14 | 136.17 | 136.18 | 136.19 | 136.20 | 136.22 | 136.29 | 136.30 | 136.32 | 136.33 | 136.34 | 136.35 | 136.43 | 136.44 | 136.50 | 136.52 | 136.62 | 136.67 | 136.68 | 136.72 | 136.75 | 136.80 | 136.85 | 136.89 | 137.00 | 137.03 | 137.10 | 137.13 | 137.14 | 137.16 | 137.19 | 137.20 | 137.21 | 137.25 | 137.28 | 137.29 | 137.33 | 137.34 | 137.40 | 137.44 | 137.50 | 137.52 | 137.53 | 137.57 | 137.60 | 137.63 | 137.65 | 137.67 | 137.70 | 137.71 | 137.75 | 137.76 | 137.83 | 137.85 | 137.91 | 137.94 | 138.00 | 138.05 | 138.06 | 138.13 | 138.14 | 138.15 | 138.20 | 138.21 | 138.24 | 138.25 | 138.26 | 138.30 | 138.33 | 138.38 | 138.42 | 138.50 | 138.54 | 138.55 | 138.56 | 138.57 | 138.60 | 138.67 | 138.70 | 138.71 | 138.76 | 138.77 | 138.83 | 138.86 | 138.90 | 138.93 | 138.99 | 139.00 | 139.05 | 139.11 | 139.13 | 139.14 | 139.20 | 139.24 | 139.28 | 139.33 | 139.34 | 139.40 | 139.50 | 139.51 | 139.52 | 139.57 | 139.60 | 139.70 | 139.80 | 139.86 | 139.87 | 139.88 | 139.90 | 139.95 | 139.99 | 140.00 | 140.05 | 140.08 | 140.10 | 140.13 | 140.14 | 140.15 | 140.17 | 140.18 | 140.20 | 140.25 | 140.27 | 140.30 | 140.33 | 140.40 | 140.43 | 140.50 | 140.53 | 140.58 | 140.59 | 140.60 | 140.63 | 140.67 | 140.70 | 140.75 | 140.76 | 140.80 | 140.85 | 140.88 | 140.90 | 140.94 | 140.98 | 141.00 | 141.04 | 141.10 | 141.14 | 141.18 | 141.30 | 141.32 | 141.33 | 141.43 | 141.46 | 141.49 | 141.50 | 141.53 | 141.55 | 141.56 | 141.60 | 141.67 | 141.68 | 141.69 | 141.71 | 141.75 | 141.78 | 141.83 | 141.84 | 141.88 | 141.90 | 141.91 | 141.95 | 142.00 | 142.07 | 142.09 | 142.10 | 142.13 | 142.17 | 142.19 | 142.20 | 142.25 | 142.29 | 142.30 | 142.33 | 142.36 | 142.38 | 142.40 | 
142.41 | 142.43 | 142.45 | 142.50 | 142.54 | 142.56 | 142.60 | 142.61 | 142.63 | 142.64 | 142.65 | 142.67 | 142.72 | 142.78 | 142.80 | 142.83 | 142.88 | 143.00 | 143.03 | 143.07 | 143.10 | 143.14 | 143.18 | 143.20 | 143.22 | 143.23 | 143.28 | 143.29 | 143.30 | 143.33 | 143.38 | 143.40 | 143.41 | 143.43 | 143.44 | 143.45 | 143.50 | 143.52 | 143.55 | 143.60 | 143.65 | 143.70 | 143.71 | 143.84 | 143.86 | 143.94 | 143.99 | 144.00 | 144.10 | 144.13 | 144.14 | 144.23 | 144.29 | 144.33 | 144.37 | 144.40 | 144.45 | 144.50 | 144.55 | 144.57 | 144.60 | 144.63 | 144.64 | 144.67 | 144.68 | 144.74 | 144.76 | 144.80 | 144.84 | 144.86 | 144.90 | 144.98 | 145.00 | 145.08 | 145.10 | 145.20 | 145.24 | 145.25 | 145.28 | 145.30 | 145.35 | 145.40 | 145.41 | 145.43 | 145.50 | 145.52 | 145.53 | 145.55 | 145.56 | 145.60 | 145.61 | 145.67 | 145.69 | 145.70 | 145.71 | 145.80 | 145.84 | 145.85 | 145.90 | 145.93 | 145.96 | 145.98 | 146.00 | 146.03 | 146.10 | 146.11 | 146.16 | 146.17 | 146.20 | 146.25 | 146.30 | 146.33 | 146.37 | 146.40 | 146.50 | 146.51 | 146.53 | 146.57 | 146.59 | 146.60 | 146.67 | 146.68 | 146.70 | 146.75 | 146.78 | 146.80 | 146.92 | 146.93 | 146.96 | 147.00 | 147.05 | 147.06 | 147.10 | 147.15 | 147.20 | 147.25 | 147.30 | 147.33 | 147.42 | 147.46 | 147.50 | 147.56 | 147.59 | 147.60 | 147.67 | 147.71 | 147.72 | 147.73 | 147.75 | 147.78 | 147.86 | 147.87 | 147.90 | 147.93 | 148.00 | 148.04 | 148.05 | 148.10 | 148.13 | 148.15 | 148.20 | 148.26 | 148.28 | 148.33 | 148.37 | 148.38 | 148.39 | 148.50 | 148.58 | 148.60 | 148.67 | 148.69 | 148.70 | 148.75 | 148.80 | 148.86 | 148.90 | 148.91 | 148.95 | 149.00 | 149.10 | 149.15 | 149.18 | 149.25 | 149.30 | 149.32 | 149.33 | 149.38 | 149.40 | 149.48 | 149.50 | 149.58 | 149.60 | 149.67 | 149.68 | 149.70 | 149.75 | 149.76 | 149.77 | 149.79 | 149.81 | 149.85 | 149.88 | 150.00 | 150.02 | 150.06 | 150.07 | 150.11 | 150.12 | 150.15 | 150.17 | 150.22 | 150.24 | 150.25 | 150.30 | 150.33 | 150.40 | 150.42 | 150.43 | 150.45 | 150.48 | 150.50 | 
150.55 | 150.56 | 150.57 | 150.61 | 150.67 | 150.75 | 150.80 | 150.90 | 150.92 | 150.98 | 151.00 | 151.06 | 151.07 | 151.14 | 151.17 | 151.19 | 151.20 | 151.30 | 151.33 | 151.40 | 151.43 | 151.47 | 151.50 | 151.59 | 151.67 | 151.70 | 151.75 | 151.80 | 151.84 | 151.85 | 151.94 | 152.00 | 152.05 | 152.08 | 152.10 | 152.15 | 152.20 | 152.28 | 152.30 | 152.33 | 152.40 | 152.50 | 152.53 | 152.55 | 152.60 | 152.63 | 152.67 | 152.70 | 152.76 | 152.78 | 152.80 | 152.81 | 152.84 | 152.85 | 152.87 | 152.98 | 152.99 | 153.00 | 153.03 | 153.05 | 153.09 | 153.15 | 153.18 | 153.23 | 153.30 | 153.33 | 153.38 | 153.40 | 153.49 | 153.50 | 153.51 | 153.60 | 153.62 | 153.67 | 153.68 | 153.72 | 153.77 | 153.79 | 153.80 | 153.85 | 153.90 | 154.00 | 154.07 | 154.10 | 154.16 | 154.21 | 154.29 | 154.33 | 154.35 | 154.40 | 154.50 | 154.58 | 154.67 | 154.70 | 154.71 | 154.75 | 154.80 | 154.88 | 154.93 | 154.98 | 155.00 | 155.03 | 155.06 | 155.08 | 155.10 | 155.13 | 155.20 | 155.25 | 155.33 | 155.40 | 155.45 | 155.49 | 155.50 | 155.52 | 155.54 | 155.57 | 155.60 | 155.61 | 155.64 | 155.67 | 155.68 | 155.70 | 155.72 | 155.73 | 155.75 | 155.76 | 155.77 | 155.80 | 155.81 | 155.83 | 156.00 | 156.06 | 156.15 | 156.24 | 156.25 | 156.30 | 156.33 | 156.38 | 156.40 | 156.42 | 156.45 | 156.50 | 156.60 | 156.61 | 156.67 | 156.70 | 156.75 | 156.80 | 156.90 | 156.92 | 156.93 | 157.00 | 157.05 | 157.14 | 157.17 | 157.20 | 157.25 | 157.27 | 157.28 | 157.33 | 157.44 | 157.45 | 157.46 | 157.50 | 157.60 | 157.61 | 157.67 | 157.68 | 157.70 | 157.71 | 157.76 | 157.85 | 157.90 | 157.95 | 158.00 | 158.10 | 158.20 | 158.22 | 158.33 | 158.34 | 158.38 | 158.40 | 158.50 | 158.53 | 158.56 | 158.67 | 158.70 | 158.83 | 158.85 | 158.86 | 158.93 | 158.95 | 159.00 | 159.07 | 159.08 | 159.12 | 159.17 | 159.20 | 159.23 | 159.30 | 159.33 | 159.38 | 159.43 | 159.50 | 159.60 | 159.68 | 159.70 | 159.75 | 159.80 | 159.84 | 159.90 | 160.00 | 160.02 | 160.08 | 160.16 | 160.20 | 160.29 | 160.30 | 160.33 | 160.47 | 160.50 | 160.53 | 
160.60 | 160.65 | 160.67 | 160.76 | 160.78 | 160.80 | 160.83 | 160.89 | 160.95 | 161.00 | 161.10 | 161.15 | 161.18 | 161.23 | 161.25 | 161.28 | 161.33 | 161.43 | 161.50 | 161.60 | 161.66 | 161.67 | 161.70 | 161.78 | 161.80 | 161.83 | 161.88 | 161.90 | 162.00 | 162.01 | 162.15 | 162.25 | 162.29 | 162.30 | 162.33 | 162.35 | 162.40 | 162.45 | 162.47 | 162.50 | 162.67 | 162.75 | 162.79 | 162.90 | 162.98 | 163.00 | 163.17 | 163.18 | 163.20 | 163.22 | 163.33 | 163.34 | 163.46 | 163.50 | 163.52 | 163.58 | 163.62 | 163.63 | 163.67 | 163.71 | 163.80 | 163.88 | 163.95 | 163.97 | 164.00 | 164.03 | 164.05 | 164.10 | 164.20 | 164.22 | 164.25 | 164.30 | 164.33 | 164.40 | 164.44 | 164.50 | 164.57 | 164.60 | 164.63 | 164.67 | 164.70 | 164.80 | 164.88 | 164.90 | 165.00 | 165.06 | 165.15 | 165.30 | 165.33 | 165.38 | 165.49 | 165.50 | 165.60 | 165.63 | 165.67 | 165.75 | 166.00 | 166.05 | 166.25 | 166.33 | 166.35 | 166.40 | 166.50 | 166.58 | 166.60 | 166.67 | 166.68 | 166.70 | 166.78 | 166.80 | 166.85 | 166.90 | 167.00 | 167.04 | 167.10 | 167.20 | 167.22 | 167.24 | 167.40 | 167.45 | 167.50 | 167.66 | 167.70 | 167.74 | 167.76 | 167.79 | 167.80 | 167.85 | 167.88 | 167.95 | 168.00 | 168.08 | 168.10 | 168.25 | 168.30 | 168.33 | 168.38 | 168.50 | 168.63 | 168.73 | 168.75 | 168.89 | 168.90 | 169.00 | 169.07 | 169.11 | 169.15 | 169.17 | 169.20 | 169.33 | 169.38 | 169.50 | 169.52 | 169.58 | 169.60 | 169.63 | 169.65 | 169.75 | 169.88 | 169.91 | 169.93 | 169.98 | 170.00 | 170.10 | 170.17 | 170.28 | 170.33 | 170.36 | 170.40 | 170.50 | 170.61 | 170.63 | 170.80 | 170.85 | 170.88 | 170.90 | 171.00 | 171.10 | 171.13 | 171.14 | 171.20 | 171.26 | 171.30 | 171.33 | 171.42 | 171.44 | 171.45 | 171.60 | 171.66 | 171.67 | 171.72 | 171.87 | 171.90 | 171.96 | 172.00 | 172.07 | 172.20 | 172.23 | 172.26 | 172.29 | 172.50 | 172.55 | 172.67 | 172.80 | 172.85 | 173.00 | 173.12 | 173.19 | 173.25 | 173.33 | 173.35 | 173.50 | 173.67 | 173.70 | 173.73 | 173.80 | 173.83 | 173.95 | 174.00 | 174.07 | 174.15 | 174.21 | 
174.25 | 174.30 | 174.33 | 174.42 | 174.50 | 174.60 | 174.67 | 174.80 | 174.90 | 175.00 | 175.05 | 175.08 | 175.10 | 175.11 | 175.14 | 175.20 | 175.28 | 175.32 | 175.33 | 175.40 | 175.49 | 175.50 | 175.56 | 175.65 | 175.67 | 175.71 | 175.95 | 176.00 | 176.04 | 176.25 | 176.30 | 176.33 | 176.34 | 176.40 | 176.42 | 176.45 | 176.67 | 176.85 | 176.91 | 176.95 | 177.00 | 177.08 | 177.09 | 177.30 | 177.31 | 177.33 | 177.35 | 177.50 | 177.62 | 177.65 | 177.66 | 177.70 | 177.75 | 177.77 | 177.81 | 177.93 | 178.00 | 178.20 | 178.25 | 178.33 | 178.42 | 178.50 | 178.53 | 178.56 | 178.59 | 178.65 | 178.67 | 178.73 | 178.80 | 179.00 | 179.01 | 179.10 | 179.25 | 179.30 | 179.38 | 179.50 | 179.71 | 179.92 | 180.00 | 180.16 | 180.20 | 180.25 | 180.30 | 180.33 | 180.38 | 180.50 | 180.63 | 180.67 | 180.68 | 180.90 | 181.00 | 181.16 | 181.17 | 181.19 | 181.20 | 181.43 | 181.50 | 181.67 | 181.70 | 181.73 | 181.75 | 181.79 | 181.80 | 182.00 | 182.14 | 182.18 | 182.25 | 182.26 | 182.33 | 182.34 | 182.40 | 182.50 | 182.53 | 182.70 | 182.75 | 182.83 | 182.85 | 183.00 | 183.08 | 183.10 | 183.15 | 183.21 | 183.30 | 183.33 | 183.50 | 183.60 | 183.67 | 183.73 | 183.74 | 183.86 | 183.90 | 183.99 | 184.00 | 184.05 | 184.06 | 184.24 | 184.26 | 184.29 | 184.33 | 184.34 | 184.50 | 184.52 | 184.57 | 184.63 | 184.65 | 184.80 | 184.95 | 184.99 | 185.00 | 185.03 | 185.10 | 185.20 | 185.22 | 185.25 | 185.28 | 185.30 | 185.33 | 185.40 | 185.57 | 185.60 | 185.67 | 185.85 | 186.00 | 186.09 | 186.20 | 186.22 | 186.25 | 186.30 | 186.33 | 186.40 | 186.50 | 186.52 | 186.67 | 186.75 | 186.77 | 187.00 | 187.04 | 187.11 | 187.20 | 187.25 | 187.28 | 187.43 | 187.50 | 187.55 | 187.85 | 188.00 | 188.01 | 188.10 | 188.16 | 188.40 | 188.50 | 188.81 | 188.88 | 189.00 | 189.18 | 189.30 | 189.33 | 189.43 | 189.44 | 189.49 | 189.50 | 189.55 | 189.75 | 189.90 | 190.00 | 190.01 | 190.05 | 190.13 | 190.29 | 190.40 | 190.50 | 190.58 | 190.80 | 190.93 | 191.00 | 191.20 | 191.25 | 191.35 | 191.39 | 191.40 | 191.50 | 191.57 | 
191.58 | 191.70 | 191.73 | 191.90 | 192.00 | 192.03 | 192.30 | 192.59 | 192.60 | 192.67 | 192.70 | 192.80 | 192.86 | 192.90 | 192.95 | 193.00 | 193.04 | 193.05 | 193.20 | 193.28 | 193.33 | 193.50 | 193.68 | 194.00 | 194.06 | 194.13 | 194.29 | 194.30 | 194.33 | 194.34 | 194.40 | 194.50 | 194.58 | 194.60 | 194.67 | 194.73 | 194.80 | 194.85 | 195.00 | 195.25 | 195.30 | 195.33 | 195.40 | 195.47 | 195.50 | 195.53 | 195.67 | 195.70 | 195.75 | 196.00 | 196.07 | 196.12 | 196.14 | 196.20 | 196.25 | 196.33 | 196.35 | 196.46 | 196.50 | 196.51 | 196.67 | 196.91 | 197.00 | 197.10 | 197.12 | 197.40 | 197.50 | 197.55 | 197.60 | 197.65 | 197.70 | 197.78 | 197.82 | 198.00 | 198.28 | 198.30 | 198.44 | 198.50 | 198.59 | 198.68 | 198.77 | 198.88 | 198.90 | 198.98 | 199.00 | 199.13 | 199.16 | 199.30 | 199.33 | 199.40 | 199.43 | 199.50 | 199.67 | 199.77 | 199.80 | 199.82 | 200.00 | 200.10 | 200.20 | 200.25 | 200.29 | 200.33 | 200.50 | 200.70 | 200.75 | 200.86 | 201.00 | 201.08 | 201.16 | 201.25 | 201.33 | 201.40 | 201.50 | 201.60 | 201.66 | 201.75 | 201.95 | 202.00 | 202.05 | 202.30 | 202.50 | 202.51 | 202.54 | 202.58 | 202.67 | 202.71 | 202.74 | 202.88 | 203.00 | 203.02 | 203.33 | 203.50 | 203.73 | 203.76 | 203.81 | 203.85 | 204.00 | 204.30 | 204.50 | 204.53 | 204.60 | 204.75 | 204.82 | 204.86 | 205.00 | 205.06 | 205.20 | 205.38 | 205.50 | 206.00 | 206.10 | 206.28 | 206.33 | 206.47 | 206.50 | 206.52 | 206.55 | 206.67 | 207.00 | 207.22 | 207.25 | 207.33 | 207.45 | 207.63 | 207.64 | 207.90 | 208.00 | 208.13 | 208.20 | 208.41 | 208.62 | 208.66 | 208.80 | 208.83 | 208.93 | 209.00 | 209.10 | 209.25 | 209.40 | 209.50 | 209.70 | 209.84 | 210.00 | 210.03 | 210.40 | 210.57 | 210.60 | 210.80 | 211.00 | 211.08 | 211.33 | 211.41 | 211.50 | 211.76 | 212.00 | 212.06 | 212.10 | 212.29 | 212.40 | 212.42 | 212.67 | 212.72 | 212.73 | 212.80 | 212.86 | 213.00 | 213.10 | 213.30 | 213.33 | 213.60 | 213.63 | 213.81 | 214.00 | 214.10 | 214.20 | 214.21 | 214.40 | 214.50 | 214.60 | 214.67 | 214.75 | 214.76 | 
214.83 | 215.00 | 215.10 | 215.27 | 215.50 | 215.55 | 215.60 | 216.00 | 216.33 | 216.45 | 216.64 | 216.67 | 216.86 | 216.90 | 217.00 | 217.10 | 217.11 | 217.30 | 217.60 | 217.67 | 217.75 | 217.80 | 218.00 | 218.03 | 218.33 | 218.43 | 218.72 | 218.96 | 219.00 | 219.33 | 219.46 | 219.60 | 220.00 | 220.05 | 220.40 | 220.50 | 220.53 | 220.73 | 220.80 | 221.00 | 221.18 | 221.30 | 221.33 | 221.38 | 221.39 | 221.40 | 221.49 | 222.00 | 222.17 | 222.30 | 222.57 | 223.00 | 223.20 | 223.32 | 223.33 | 223.60 | 223.75 | 223.76 | 224.00 | 224.10 | 224.33 | 224.50 | 224.67 | 225.00 | 225.20 | 225.29 | 225.36 | 225.40 | 225.43 | 225.48 | 226.00 | 226.17 | 226.33 | 226.35 | 226.50 | 226.67 | 226.80 | 226.84 | 227.00 | 227.15 | 227.90 | 228.00 | 228.21 | 228.30 | 228.33 | 228.60 | 228.76 | 228.80 | 229.00 | 229.33 | 229.46 | 229.50 | 229.60 | 230.00 | 230.84 | 231.00 | 231.15 | 231.33 | 231.50 | 231.60 | 231.75 | 231.78 | 231.81 | 232.00 | 232.16 | 232.20 | 232.33 | 232.50 | 232.57 | 233.00 | 233.10 | 233.33 | 233.75 | 234.00 | 234.33 | 234.35 | 234.36 | 234.50 | 234.71 | 234.96 | 235.00 | 235.33 | 235.50 | 235.63 | 235.94 | 236.00 | 236.14 | 236.33 | 236.52 | 236.67 | 237.00 | 237.20 | 237.48 | 237.60 | 237.71 | 237.75 | 238.15 | 238.40 | 239.00 | 239.10 | 239.30 | 239.64 | 239.80 | 240.00 | 240.30 | 240.70 | 241.00 | 241.48 | 241.59 | 242.00 | 242.33 | 242.50 | 242.67 | 242.77 | 242.86 | 242.90 | 243.00 | 243.33 | 243.56 | 243.60 | 243.90 | 243.96 | 244.00 | 244.80 | 244.86 | 245.00 | 246.25 | 246.33 | 246.38 | 246.60 | 246.67 | 247.00 | 247.50 | 248.00 | 248.33 | 248.49 | 249.00 | 249.63 | 250.00 | 250.50 | 250.75 | 250.96 | 251.50 | 251.70 | 252.00 | 252.50 | 253.10 | 253.20 | 253.33 | 253.51 | 253.67 | 254.00 | 254.50 | 254.96 | 255.00 | 255.50 | 256.00 | 256.10 | 256.50 | 256.67 | 257.00 | 258.00 | 259.00 | 259.20 | 260.00 | 260.40 | 260.50 | 260.90 | 261.00 | 262.00 | 262.70 | 263.00 | 263.33 | 263.55 | 263.91 | 264.10 | 264.44 | 265.00 | 265.44 | 266.30 | 266.43 | 266.67 | 
266.96 | 267.00 | 267.86 | 269.00 | 269.30 | 269.50 | 270.00 | 272.00 | 272.50 | 274.20 | 274.50 | 275.00 | 275.90 | 277.00 | 277.50 | 277.67 | 278.00 | 278.90 | 279.00 | 279.20 | 280.00 | 282.00 | 282.50 | 283.23 | 283.67 | 284.10 | 284.20 | 285.17 | 287.00 | 287.50 | 288.00 | 289.00 | 290.00 | 290.01 | 293.75 | 294.00 | 294.55 | 295.00 | 296.00 | 297.00 | 299.33 | 300.00 | 305.00 | 306.00 | 307.00 | 307.80 | 312.50 | 313.67 | 314.10 | 315.00 | 316.00 | 317.50 | 318.50 | 321.50 | 321.94 | 326.50 | 328.67 | 332.57 | 336.00 | 349.63 | 352.50 | 365.00 | 375.50 | 510.00 | 540.00 |
[Output truncated: a very wide cross-tabulation of `market_segment_type` (rows shown include Aviation and Complementary) against a high-cardinality numeric column. One column is produced per distinct value, so the table runs to hundreds of columns with almost every cell equal to 0 and only scattered non-zero counts (e.g. 457, 78, 74). The printed table is too sparse to be readable and is summarized here instead.]
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Corporate | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 14 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 574 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 57 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 189 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 70 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 23 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 80 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 97 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 19 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 22 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 46 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 
| 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 146 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 43 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 24 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 26 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 55 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Offline | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 2 | 1 | 1 | 1 | 0 | 6 | 1 | 1 | 3 | 3 | 1 | 1 | 16 | 3 | 0 | 0 | 3 | 1 | 11 | 1 | 4 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 5 | 10 | 0 | 1 | 0 | 0 | 1 | 0 | 2 | 21 | 1 | 3 | 0 | 1 | 0 | 2 | 1 | 4 | 0 | 2 | 0 | 0 | 7 | 0 | 0 | 0 | 1 | 3 | 1 | 0 | 1 | 1 | 1 | 0 | 5 | 0 | 5 | 3 | 0 | 0 | 0 | 0 | 6 | 0 | 4 | 0 | 0 | 3 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 | 1 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 13 | 0 | 0 | 0 | 0 | 3 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 5 | 1 | 0 | 0 | 26 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 2 | 0 | 0 | 11 | 0 | 2 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 62 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 | 0 | 0 | 3 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 27 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 164 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 3 | 0 | 0 | 0 | 2 | 2 | 0 | 0 | 1 | 0 | 0 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 2 | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 20 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 3 | 0 | 1 | 0 | 2 | 2 | 0 | 0 | 10 | 0 | 0 | 0 | 23 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 8 | 0 | 0 | 42 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 13 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 | 0 | 0 | 3 | 0 | 2 | 0 | 0 | 0 | 1 | 1 | 7 | 0 | 0 | 106 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 | 3 | 0 | 0 | 1 | 1 | 0 | 0 | 2 | 0 | 0 | 0 | 1 | 24 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 21 | 1 | 0 | 0 | 0 | 0 | 0 | 1 
| 1 | 2 | 3 | 1 | 1 | 1 | 1 | 2 | 3 | 1 | 2 | 2 | 1 | 15 | 1 | 1 | 0 | 1 | 2 | 1 | 3 | 1 | 1 | 1 | 1 | 2 | 1 | 3 | 1 | 3 | 1 | 1 | 1 | 0 | 1 | 2 | 1 | 1 | 2 | 3 | 1 | 1 | 1 | 1 | 5 | 1 | 1 | 1 | 2 | 2 | 2 | 3 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 |
## Q3. Hotel rates are dynamic and change according to demand and customer demographics. What are the differences in room prices in different market segments?
data.groupby('market_segment_type')['avg_price_per_room'].mean()
market_segment_type
Aviation         103.234256
Complementary      2.773044
Corporate         82.486086
Offline           87.675326
Online           119.891277
Name: avg_price_per_room, dtype: float64
# relationship between market_segment_type and avg_price_per_room
plt.figure(figsize=(15, 5))
sns.barplot(x='market_segment_type', y='avg_price_per_room', data=data, palette='plasma');
plt.xticks(rotation=90, fontsize=15);
## Q4. What percentage of bookings are canceled?
labeled_barplot(data, 'booking_status', perc=True)
# Display a count plot and a pie chart of booking_status
colors_list = ['#3366cc','#651593','#a03b87','#e4a859','#da8266',"#FAAE7B", '#ffcc00','#ffff66']
f, ax = plt.subplots(1, 2, figsize=(15, 5))
data['booking_status'].value_counts().plot.pie(autopct='%1.1f%%', ax=ax[0], startangle=90, colors=colors_list, pctdistance=1.1, labeldistance=1.3)
ax[0].set_title('Booking Status')
ax[0].set_ylabel('')
sns.countplot(x='booking_status', data=data, ax=ax[1], palette=colors_list)
ax[1].set_title('Booking Status')
plt.show()
## Q5. What percentage of repeating guests cancel?
plt.figure(figsize=(10, 7))
ax = sns.countplot(x='booking_status', hue='repeated_guest', data=data, palette='plasma')
plt.title('Booking status by repeated guest')
plt.xlabel('Booking Status')
plt.ylabel('Count')
bar_perc(ax, data['booking_status'])
data.groupby('booking_status')['repeated_guest'].count()
booking_status
Canceled        14487
Not_Canceled    28089
Name: repeated_guest, dtype: int64
pd.crosstab(data.booking_status, data.repeated_guest).sum()
repeated_guest
0    41261
1     1315
dtype: int64
stacked_barplot(data, "repeated_guest", "booking_status")
| repeated_guest | Canceled | Not_Canceled | All |
|---|---|---|---|
| All | 14487 | 28089 | 42576 |
| 0 | 14477 | 26784 | 41261 |
| 1 | 10 | 1305 | 1315 |
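A quick back-of-the-envelope computation from the counts in the stacked table above makes the contrast explicit: repeated guests cancel far less often than first-time guests. (This is a derived sketch, not a cell from the original notebook.)

```python
# Cancellation rates derived from the stacked table above
repeat_cancel_rate = 10 / 1315     # repeated guests: canceled / total
new_cancel_rate = 14477 / 41261    # first-time guests: canceled / total
print(f"repeated guests: {repeat_cancel_rate:.1%}, first-time guests: {new_cancel_rate:.1%}")
# repeated guests cancel about 0.8% of the time, first-time guests about 35.1%
```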
## Q6. Many guests have special requirements when booking a hotel room. Do these requirements affect booking cancellation?
plt.figure(figsize=(10, 7))
ax = sns.countplot(x='booking_status', hue='no_of_special_requests', data=data, palette='plasma')
plt.title('Booking status by number of special requests')
plt.xlabel('Booking Status')
plt.ylabel('Count')
bar_perc(ax, data['booking_status'])
stacked_barplot(data, "no_of_special_requests", "booking_status")
| no_of_special_requests | Canceled | Not_Canceled | All |
|---|---|---|---|
| All | 14487 | 28089 | 42576 |
| 0 | 8752 | 10476 | 19228 |
| 1 | 4346 | 11225 | 15571 |
| 2 | 1389 | 4992 | 6381 |
| 3 | 0 | 1230 | 1230 |
| 4 | 0 | 150 | 150 |
| 5 | 0 | 16 | 16 |
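Expressing the table above as cancellation rates makes the trend explicit: the more special requests a booking carries, the less likely it is to be canceled, and no booking with three or more requests was canceled in this data. (A derived sketch using the counts from the table, not a cell from the original notebook.)

```python
# Cancellation rate per number of special requests, using the counts from the table above
canceled = {0: 8752, 1: 4346, 2: 1389, 3: 0, 4: 0, 5: 0}
total = {0: 19228, 1: 15571, 2: 6381, 3: 1230, 4: 150, 5: 16}
rates = {k: canceled[k] / total[k] for k in total}
print({k: round(v, 3) for k, v in rates.items()})
```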
data.columns
Index(['no_of_adults', 'no_of_children', 'no_of_weekend_nights',
'no_of_week_nights', 'type_of_meal_plan', 'required_car_parking_space',
'room_type_reserved', 'lead_time', 'arrival_year', 'arrival_month',
'arrival_date', 'market_segment_type', 'repeated_guest',
'no_of_previous_cancellations', 'no_of_previous_bookings_not_canceled',
'avg_price_per_room', 'no_of_special_requests', 'booking_status'],
dtype='object')
stacked_barplot(data, "type_of_meal_plan", "booking_status")
| type_of_meal_plan | Canceled | Not_Canceled | All |
|---|---|---|---|
| All | 14487 | 28089 | 42576 |
| Meal Plan 1 | 10511 | 21352 | 31863 |
| Not Selected | 3118 | 5598 | 8716 |
| Meal Plan 2 | 857 | 1132 | 1989 |
| Meal Plan 3 | 1 | 7 | 8 |
stacked_barplot(data, "room_type_reserved", "booking_status")
| room_type_reserved | Canceled | Not_Canceled | All |
|---|---|---|---|
| All | 14487 | 28089 | 42576 |
| Room_Type 1 | 9225 | 20505 | 29730 |
| Room_Type 4 | 3683 | 5686 | 9369 |
| Room_Type 6 | 826 | 714 | 1540 |
| Room_Type 5 | 367 | 539 | 906 |
| Room_Type 2 | 274 | 444 | 718 |
| Room_Type 7 | 110 | 197 | 307 |
| Room_Type 3 | 2 | 4 | 6 |
The Chi-Square test is a statistical method to determine whether two categorical variables have a significant association.
Null Hypothesis: there is no association between the two variables.
Alternate Hypothesis: there is an association between the two variables.
import scipy.stats as stats
crosstab = pd.crosstab(
    data["no_of_special_requests"], data["repeated_guest"]
)  # contingency table of no_of_special_requests and repeated_guest
Ho = "no_of_special_requests has no effect on repeated_guest"  # stating the null hypothesis
Ha = "no_of_special_requests has an effect on repeated_guest"  # stating the alternate hypothesis
chi, p_value, dof, expected = stats.chi2_contingency(crosstab)
if p_value < 0.05:  # setting our significance level at 5%
    print(f"{Ha} as the p_value ({p_value.round(3)}) < 0.05")
else:
    print(f"{Ho} as the p_value ({p_value.round(3)}) > 0.05")
no_of_special_requests has an effect on repeated_guest as the p_value (0.0) < 0.05
import scipy.stats as stats
crosstab = pd.crosstab(
    data["market_segment_type"], data["repeated_guest"]
)  # contingency table of market_segment_type and repeated_guest
Ho = "market_segment_type has no effect on repeated_guest"  # stating the null hypothesis
Ha = "market_segment_type has an effect on repeated_guest"  # stating the alternate hypothesis
chi, p_value, dof, expected = stats.chi2_contingency(crosstab)
if p_value < 0.05:  # setting our significance level at 5%
    print(f"{Ha} as the p_value ({p_value.round(3)}) < 0.05")
else:
    print(f"{Ho} as the p_value ({p_value.round(3)}) > 0.05")
market_segment_type has an effect on repeated_guest as the p_value (0.0) < 0.05
import scipy.stats as stats
crosstab = pd.crosstab(
    data["no_of_special_requests"], data["market_segment_type"]
)  # contingency table of no_of_special_requests and market_segment_type
Ho = "no_of_special_requests has no effect on market_segment_type"  # stating the null hypothesis
Ha = "no_of_special_requests has an effect on market_segment_type"  # stating the alternate hypothesis
chi, p_value, dof, expected = stats.chi2_contingency(crosstab)
if p_value < 0.05:  # setting our significance level at 5%
    print(f"{Ha} as the p_value ({p_value.round(3)}) < 0.05")
else:
    print(f"{Ho} as the p_value ({p_value.round(3)}) > 0.05")
no_of_special_requests has an effect on market_segment_type as the p_value (0.0) < 0.05
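The three nearly identical cells above can be factored into a single helper. A minimal sketch (the function name `chi_square_test` is my own, not from the original notebook):

```python
import pandas as pd
import scipy.stats as stats

def chi_square_test(df, col1, col2, alpha=0.05):
    """Chi-square test of independence between two categorical columns of df."""
    crosstab = pd.crosstab(df[col1], df[col2])  # contingency table
    chi2, p_value, dof, expected = stats.chi2_contingency(crosstab)
    if p_value < alpha:
        print(f"{col1} has an effect on {col2} (p_value = {p_value:.3f} < {alpha})")
    else:
        print(f"{col1} has no effect on {col2} (p_value = {p_value:.3f} >= {alpha})")
    return p_value
```

Usage would then be a one-liner per pair, e.g. `chi_square_test(data, "no_of_special_requests", "repeated_guest")`.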
# Function to annotate bar plots with the percentage of each category
def bar_perc(plot, feature):
    '''
    plot: matplotlib Axes containing the bars
    feature: 1-d categorical feature array
    '''
    total = len(feature)  # length of the column
    for p in plot.patches:
        percentage = '{:.1f}%'.format(100 * p.get_height() / total)  # percentage of each class of the category
        x = p.get_x() + p.get_width() / 2 - 0.05  # x position of the annotation
        y = p.get_y() + p.get_height()  # y position (top of the bar)
        plot.annotate(percentage, (x, y), size=12)  # annotate the percentage
# count-percentage bar plot of booking status by number of special requests
# Using a crosstab to explore two categorical variables:
# the variable of interest goes on the index, crossed by the other
crosstab_eda = pd.crosstab(index=data['booking_status'],
                           columns=data['no_of_special_requests'],
                           normalize=True)
# Plotting the crosstab computed above
axis1 = crosstab_eda.plot(kind="bar",       # bar plot of the categorical counts
                          figsize=(15, 7),  # adjusting the figure size
                          stacked=True)     # stack the bars per booking status
plt.title("Special requests by booking status", fontsize=20)  # setting the title size
plt.xlabel("Booking status", fontsize=18)  # setting the x label size
plt.ylabel("Proportion", fontsize=18)  # setting the y label size
plt.xticks(rotation=0)
bar_perc(axis1, crosstab_eda)
plt.show()  # rendering
numerical_col = data.select_dtypes(include=np.number).columns.tolist()
numerical_col
['no_of_adults', 'no_of_children', 'no_of_weekend_nights', 'no_of_week_nights', 'required_car_parking_space', 'lead_time', 'arrival_year', 'arrival_month', 'arrival_date', 'repeated_guest', 'no_of_previous_cancellations', 'no_of_previous_bookings_not_canceled', 'avg_price_per_room', 'no_of_special_requests']
# Display box plots of the numerical variables vs booking_status
num_col=['no_of_adults',
'no_of_children',
'no_of_weekend_nights',
'no_of_week_nights',
'required_car_parking_space',
'lead_time',
'repeated_guest',
'no_of_previous_cancellations',
'no_of_previous_bookings_not_canceled',
'avg_price_per_room',
'no_of_special_requests']
for col in num_col:
    sns.boxplot(x='booking_status', y=col, data=data, palette='winter').set(title=("booking_status VS " + col).upper())
    plt.show()
# Display the same box plots with outliers hidden
num_col=['no_of_adults',
'no_of_children',
'no_of_weekend_nights',
'no_of_week_nights',
'required_car_parking_space',
'lead_time',
'repeated_guest',
'no_of_previous_cancellations',
'no_of_previous_bookings_not_canceled',
'avg_price_per_room',
'no_of_special_requests']
for col in num_col:
    sns.boxplot(x='booking_status', y=col, data=data, palette='winter', showfliers=False).set(title=("booking_status VS " + col).upper())
    plt.show()
If outliers are removed, the boxplots above make the differences between canceled and non-canceled bookings much easier to compare.
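The `showfliers=False` option only hides outliers in the plot; to actually identify them, the usual Tukey fences can be computed. A minimal sketch (the helper name `iqr_bounds` is my own):

```python
import numpy as np

def iqr_bounds(values):
    """Tukey fences: points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] count as outliers."""
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    return q1 - 1.5 * iqr, q3 + 1.5 * iqr

low, high = iqr_bounds([1, 2, 3, 4, 100])  # 100 is an obvious outlier
print(low, high)  # -1.0 7.0
```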
# Display histograms of avg_price_per_room by each categorical variable
for col in cat_columns:
    sns.histplot(data=data, x="avg_price_per_room", hue=col, kde=True)
    plt.title("avg_price_per_room vs " + str(col));
    plt.show()
# Display histograms of lead_time by each categorical variable
for col in cat_columns:
    sns.histplot(data=data, x="lead_time", hue=col, kde=True)
    plt.title("lead_time vs " + str(col));
    plt.show()
# Display histograms of no_of_children by each categorical variable
for col in cat_columns:
    sns.histplot(data=data, x="no_of_children", hue=col, kde=True)
    plt.title("no_of_children vs " + str(col));
    plt.show()
# Correlation matrix of the numerical variables (numeric_only avoids errors on categorical columns in newer pandas)
data.corr(numeric_only=True)
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | required_car_parking_space | lead_time | arrival_year | arrival_month | arrival_date | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| no_of_adults | 1.000000 | -0.046768 | 0.088448 | 0.114718 | -0.013978 | 0.157586 | 0.089816 | 0.001775 | 0.007152 | -0.248220 | -0.082402 | -0.151376 | 0.352854 | 0.113269 |
| no_of_children | -0.046768 | 1.000000 | 0.015463 | 0.022396 | 0.015151 | 0.036515 | 0.012982 | 0.013723 | 0.016474 | -0.048387 | -0.021786 | -0.029038 | 0.344863 | 0.063826 |
| no_of_weekend_nights | 0.088448 | 0.015463 | 1.000000 | 0.234575 | -0.054138 | 0.116011 | 0.025955 | 0.000370 | 0.000177 | -0.096068 | -0.036461 | -0.048818 | 0.002365 | 0.006193 |
| no_of_week_nights | 0.114718 | 0.022396 | 0.234575 | 1.000000 | -0.061178 | 0.209997 | 0.049051 | -0.000454 | -0.014510 | -0.121374 | -0.039081 | -0.058228 | 0.024760 | 0.026863 |
| required_car_parking_space | -0.013978 | 0.015151 | -0.054138 | -0.061178 | 1.000000 | -0.046068 | -0.046091 | 0.009392 | 0.000628 | 0.122087 | 0.035573 | 0.073901 | 0.026600 | 0.065491 |
| lead_time | 0.157586 | 0.036515 | 0.116011 | 0.209997 | -0.046068 | 1.000000 | 0.210627 | 0.105792 | 0.036721 | -0.154935 | -0.060561 | -0.088774 | 0.007367 | 0.024544 |
| arrival_year | 0.089816 | 0.012982 | 0.025955 | 0.049051 | -0.046091 | 0.210627 | 1.000000 | -0.471120 | -0.003047 | -0.015335 | -0.005479 | 0.012817 | 0.239247 | 0.034592 |
| arrival_month | 0.001775 | 0.013723 | 0.000370 | -0.000454 | 0.009392 | 0.105792 | -0.471120 | 1.000000 | -0.005708 | -0.008168 | -0.029231 | -0.009322 | 0.065882 | 0.069201 |
| arrival_date | 0.007152 | 0.016474 | 0.000177 | -0.014510 | 0.000628 | 0.036721 | -0.003047 | -0.005708 | 1.000000 | -0.010135 | -0.008540 | -0.000034 | 0.016588 | -0.001544 |
| repeated_guest | -0.248220 | -0.048387 | -0.096068 | -0.121374 | 0.122087 | -0.154935 | -0.015335 | -0.008168 | -0.010135 | 1.000000 | 0.397426 | 0.556413 | -0.200056 | 0.002098 |
| no_of_previous_cancellations | -0.082402 | -0.021786 | -0.036461 | -0.039081 | 0.035573 | -0.060561 | -0.005479 | -0.029231 | -0.008540 | 0.397426 | 1.000000 | 0.582212 | -0.084619 | 0.010017 |
| no_of_previous_bookings_not_canceled | -0.151376 | -0.029038 | -0.048818 | -0.058228 | 0.073901 | -0.088774 | 0.012817 | -0.009322 | -0.000034 | 0.556413 | 0.582212 | 1.000000 | -0.124801 | 0.034580 |
| avg_price_per_room | 0.352854 | 0.344863 | 0.002365 | 0.024760 | 0.026600 | 0.007367 | 0.239247 | 0.065882 | 0.016588 | -0.200056 | -0.084619 | -0.124801 | 1.000000 | 0.128621 |
| no_of_special_requests | 0.113269 | 0.063826 | 0.006193 | 0.026863 | 0.065491 | 0.024544 | 0.034592 | 0.069201 | -0.001544 | 0.002098 | 0.010017 | 0.034580 | 0.128621 | 1.000000 |
# A graphical representation of the correlation matrix
def plot_corr(df, size=11):
    corr = df.corr()
    fig, ax = plt.subplots(figsize=(size, size))
    ax.matshow(corr)
    plt.xticks(range(len(corr.columns)), corr.columns, rotation=90)
    plt.yticks(range(len(corr.columns)), corr.columns)
    for (i, j), z in np.ndenumerate(corr):
        ax.text(j, i, '{:0.1f}'.format(z), ha='center', va='center')
plt.figure(figsize=(10, 8))
sns.heatmap(data.corr(numeric_only=True),
annot=True,
linewidths=.5,
center=0,
cbar=False,
cmap="YlGnBu")
plt.show()
sns.pairplot(data,diag_kind='kde')
n_canceled = len(data.loc[data['booking_status'] == "Canceled"])
n_not_canceled = len(data.loc[data['booking_status'] == "Not_Canceled"])
print("Number of canceled bookings: {0} ({1:2.2f}%)".format(n_canceled, (n_canceled / (n_canceled + n_not_canceled)) * 100))
print("Number of not-canceled bookings: {0} ({1:2.2f}%)".format(n_not_canceled, (n_not_canceled / (n_canceled + n_not_canceled)) * 100))
Number of canceled bookings: 14487 (34.03%) Number of not-canceled bookings: 28089 (65.97%)
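The same class split can be read off in one line with `value_counts(normalize=True)`. A self-contained sketch reproducing the counts above on a synthetic Series:

```python
import pandas as pd

# Rebuild the target column from the counts reported above
status = pd.Series(["Canceled"] * 14487 + ["Not_Canceled"] * 28089)
pct = status.value_counts(normalize=True) * 100
print(pct.round(2))  # Canceled ~ 34.03, Not_Canceled ~ 65.97
```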
data.sample(10)
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 16836 | 1 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 7 | 2018 | 6 | 13 | Online | 0 | 0 | 0 | 95.00 | 1 | Not_Canceled |
| 34465 | 2 | 2 | 0 | 1 | Meal Plan 1 | 1 | Room_Type 7 | 15 | 2019 | 7 | 12 | Online | 0 | 0 | 0 | 243.96 | 1 | Not_Canceled |
| 24805 | 1 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 32 | 2018 | 11 | 8 | Online | 0 | 0 | 0 | 143.28 | 0 | Canceled |
| 39679 | 2 | 0 | 1 | 1 | Not Selected | 0 | Room_Type 1 | 28 | 2019 | 4 | 29 | Online | 0 | 0 | 0 | 115.00 | 0 | Canceled |
| 55974 | 1 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 15 | 2019 | 6 | 21 | Online | 0 | 0 | 0 | 170.00 | 1 | Canceled |
| 46050 | 2 | 0 | 2 | 3 | Meal Plan 1 | 0 | Room_Type 2 | 108 | 2018 | 3 | 24 | Online | 0 | 0 | 0 | 120.00 | 1 | Not_Canceled |
| 30175 | 2 | 0 | 0 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 0 | 2018 | 11 | 30 | Corporate | 0 | 0 | 0 | 73.00 | 1 | Not_Canceled |
| 36912 | 2 | 0 | 2 | 4 | Meal Plan 1 | 0 | Room_Type 1 | 29 | 2017 | 12 | 30 | Offline | 0 | 0 | 0 | 84.67 | 0 | Not_Canceled |
| 56031 | 2 | 0 | 0 | 2 | Not Selected | 0 | Room_Type 1 | 323 | 2019 | 5 | 18 | Online | 0 | 0 | 0 | 99.00 | 0 | Canceled |
| 27144 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 46 | 2019 | 4 | 17 | Online | 0 | 0 | 0 | 126.00 | 1 | Not_Canceled |
Arrival date, month, and year can be combined into a single date column to reduce dimensionality.
There are no missing values.
# rows where the month is February and the day is greater than 28
feb_d = data[(data['arrival_date'] > 28) & (data['arrival_month'] == 2)].index
len(feb_d)
35
data.loc[feb_d]
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 4108 | 2 | 0 | 1 | 5 | Meal Plan 1 | 0 | Room_Type 1 | 104 | 2018 | 2 | 29 | Online | 1 | 1 | 0 | 61.43 | 0 | Canceled |
| 5769 | 1 | 0 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 1 | 21 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 102.05 | 0 | Canceled |
| 8778 | 2 | 0 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 1 | 24 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 45.50 | 0 | Not_Canceled |
| 9928 | 1 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 117 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 76.00 | 0 | Not_Canceled |
| 11990 | 2 | 1 | 1 | 5 | Meal Plan 1 | 0 | Room_Type 1 | 35 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 98.10 | 1 | Canceled |
| 12525 | 2 | 2 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 6 | 3 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 183.00 | 1 | Not_Canceled |
| 14071 | 1 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 117 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 76.00 | 0 | Not_Canceled |
| 14345 | 2 | 2 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 6 | 3 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 189.75 | 0 | Not_Canceled |
| 14490 | 2 | 0 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 4 | 15 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 85.55 | 1 | Not_Canceled |
| 15144 | 1 | 0 | 1 | 0 | Meal Plan 1 | 0 | Room_Type 4 | 21 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 117.00 | 0 | Not_Canceled |
| 15578 | 1 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 45 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 76.30 | 0 | Not_Canceled |
| 16630 | 2 | 0 | 1 | 3 | Meal Plan 1 | 1 | Room_Type 4 | 47 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 99.40 | 1 | Not_Canceled |
| 16725 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 117 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 86.33 | 0 | Not_Canceled |
| 16859 | 2 | 0 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 1 | 88 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 56.94 | 0 | Canceled |
| 18642 | 1 | 0 | 3 | 7 | Meal Plan 1 | 0 | Room_Type 1 | 58 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 66.45 | 1 | Not_Canceled |
| 21961 | 1 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 0 | 2018 | 2 | 29 | Complementary | 0 | 0 | 0 | 3.00 | 0 | Not_Canceled |
| 22502 | 2 | 0 | 1 | 3 | Meal Plan 2 | 0 | Room_Type 1 | 13 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 114.55 | 0 | Not_Canceled |
| 24178 | 1 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 61 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 78.90 | 1 | Canceled |
| 24297 | 1 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 1 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 76.00 | 0 | Not_Canceled |
| 27046 | 2 | 0 | 1 | 3 | Meal Plan 2 | 0 | Room_Type 1 | 13 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 107.80 | 0 | Not_Canceled |
| 29149 | 1 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 5 | 3 | 2018 | 2 | 29 | Corporate | 0 | 0 | 0 | 107.00 | 0 | Not_Canceled |
| 29374 | 1 | 0 | 1 | 4 | Meal Plan 1 | 1 | Room_Type 1 | 4 | 2018 | 2 | 29 | Corporate | 1 | 0 | 11 | 68.00 | 1 | Not_Canceled |
| 29889 | 1 | 0 | 1 | 1 | Meal Plan 1 | 1 | Room_Type 1 | 7 | 2018 | 2 | 29 | Corporate | 0 | 0 | 0 | 68.00 | 0 | Not_Canceled |
| 32057 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 4 | 33 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 106.40 | 0 | Not_Canceled |
| 34023 | 1 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 117 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 75.00 | 0 | Not_Canceled |
| 34040 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 4 | 57 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 95.30 | 1 | Not_Canceled |
| 40946 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 39 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 60.10 | 0 | Not_Canceled |
| 43226 | 1 | 0 | 1 | 0 | Meal Plan 1 | 0 | Room_Type 1 | 0 | 2018 | 2 | 29 | Corporate | 1 | 0 | 10 | 65.00 | 1 | Not_Canceled |
| 43810 | 2 | 0 | 1 | 5 | Meal Plan 1 | 0 | Room_Type 4 | 115 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 102.33 | 1 | Not_Canceled |
| 47943 | 2 | 1 | 1 | 3 | Meal Plan 1 | 0 | Room_Type 4 | 13 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 86.80 | 0 | Canceled |
| 48051 | 1 | 0 | 1 | 0 | Meal Plan 1 | 0 | Room_Type 5 | 21 | 2018 | 2 | 29 | Offline | 0 | 0 | 0 | 142.00 | 0 | Not_Canceled |
| 48069 | 3 | 0 | 1 | 2 | Meal Plan 2 | 0 | Room_Type 4 | 7 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 193.00 | 2 | Not_Canceled |
| 50279 | 2 | 0 | 1 | 0 | Not Selected | 0 | Room_Type 1 | 50 | 2018 | 2 | 29 | Online | 0 | 0 | 0 | 76.50 | 0 | Canceled |
| 54359 | 1 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 3 | 2018 | 2 | 29 | Corporate | 1 | 0 | 1 | 66.00 | 0 | Not_Canceled |
| 55677 | 1 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 7 | 2018 | 2 | 29 | Corporate | 0 | 0 | 0 | 66.00 | 0 | Not_Canceled |
data.loc[feb_d,['arrival_date','arrival_month','arrival_year']]
| arrival_date | arrival_month | arrival_year | |
|---|---|---|---|
| 4108 | 29 | 2 | 2018 |
| 5769 | 29 | 2 | 2018 |
| 8778 | 29 | 2 | 2018 |
| 9928 | 29 | 2 | 2018 |
| 11990 | 29 | 2 | 2018 |
| 12525 | 29 | 2 | 2018 |
| 14071 | 29 | 2 | 2018 |
| 14345 | 29 | 2 | 2018 |
| 14490 | 29 | 2 | 2018 |
| 15144 | 29 | 2 | 2018 |
| 15578 | 29 | 2 | 2018 |
| 16630 | 29 | 2 | 2018 |
| 16725 | 29 | 2 | 2018 |
| 16859 | 29 | 2 | 2018 |
| 18642 | 29 | 2 | 2018 |
| 21961 | 29 | 2 | 2018 |
| 22502 | 29 | 2 | 2018 |
| 24178 | 29 | 2 | 2018 |
| 24297 | 29 | 2 | 2018 |
| 27046 | 29 | 2 | 2018 |
| 29149 | 29 | 2 | 2018 |
| 29374 | 29 | 2 | 2018 |
| 29889 | 29 | 2 | 2018 |
| 32057 | 29 | 2 | 2018 |
| 34023 | 29 | 2 | 2018 |
| 34040 | 29 | 2 | 2018 |
| 40946 | 29 | 2 | 2018 |
| 43226 | 29 | 2 | 2018 |
| 43810 | 29 | 2 | 2018 |
| 47943 | 29 | 2 | 2018 |
| 48051 | 29 | 2 | 2018 |
| 48069 | 29 | 2 | 2018 |
| 50279 | 29 | 2 | 2018 |
| 54359 | 29 | 2 | 2018 |
| 55677 | 29 | 2 | 2018 |
data.drop(feb_d, inplace=True)
data.shape
(42541, 18)
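2018 was not a leap year, so the 35 rows dated 2018-02-29 above are invalid dates. A more general check lets `pd.to_datetime(..., errors='coerce')` flag every impossible year/month/day combination (e.g. April 31) instead of hand-coding the February rule. A sketch on toy data:

```python
import pandas as pd

df = pd.DataFrame({
    "year": [2018, 2018, 2019],
    "month": [2, 2, 4],
    "day": [28, 29, 31],  # 2018-02-29 and 2019-04-31 do not exist
})
parsed = pd.to_datetime(df, errors="coerce")  # invalid combinations become NaT
invalid_rows = df[parsed.isna()]
print(invalid_rows)
```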
# joining the year, month, and date columns into a single datetime column
df = data.copy()
data['arrival_date_full'] = data[['arrival_year', 'arrival_month', 'arrival_date']].astype(str).agg('-'.join, axis=1)
data['arrival_date_full'] = pd.to_datetime(data['arrival_date_full'])
data['arrival_date_full']
0 2017-10-02
1 2018-11-06
2 2018-02-28
3 2018-05-20
4 2019-07-13
...
56920 2018-07-01
56921 2019-06-15
56922 2019-05-15
56923 2018-04-21
56924 2019-04-28
Name: arrival_date_full, Length: 42541, dtype: datetime64[ns]
# helper applied to every value of arrival_date_full to get the day of the week (0 = Monday)
def date_to_week(date_value):
    return date_value.weekday()
data['Day of the Week'] = data['arrival_date_full'].apply(date_to_week)
data.head()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | arrival_date_full | Day of the Week | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | 2017 | 10 | 2 | Offline | 0 | 0 | 0 | 65.00 | 0 | Not_Canceled | 2017-10-02 | 0 |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | 2018 | 11 | 6 | Online | 0 | 0 | 0 | 106.68 | 1 | Not_Canceled | 2018-11-06 | 1 |
| 2 | 1 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | 2018 | 2 | 28 | Online | 0 | 0 | 0 | 60.00 | 0 | Canceled | 2018-02-28 | 2 |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | 2018 | 5 | 20 | Online | 0 | 0 | 0 | 100.00 | 0 | Canceled | 2018-05-20 | 6 |
| 4 | 3 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | 2019 | 7 | 13 | Online | 0 | 0 | 0 | 89.10 | 2 | Canceled | 2019-07-13 | 5 |
# Mapping day-of-week numbers to weekday names
weekday={0:"Monday",
1:"Tuesday",
2:"Wednesday",
3:"Thursday",
4:"Friday",
5: "Saturday",
6:"Sunday"}
data["Week"]=data['Day of the Week'].replace(weekday)
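The `apply` plus manual dictionary above works, but pandas' `.dt` accessor does both steps in vectorized form. A sketch on a few of the dates shown above:

```python
import pandas as pd

dates = pd.to_datetime(pd.Series(["2017-10-02", "2018-11-06", "2019-07-13"]))
day_num = dates.dt.weekday       # 0 = Monday ... 6 = Sunday
day_name = dates.dt.day_name()   # "Monday", "Tuesday", ...
print(day_num.tolist(), day_name.tolist())
```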
data
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | arrival_date_full | Day of the Week | Week | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | 2017 | 10 | 2 | Offline | 0 | 0 | 0 | 65.00 | 0 | Not_Canceled | 2017-10-02 | 0 | Monday |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | 2018 | 11 | 6 | Online | 0 | 0 | 0 | 106.68 | 1 | Not_Canceled | 2018-11-06 | 1 | Tuesday |
| 2 | 1 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | 2018 | 2 | 28 | Online | 0 | 0 | 0 | 60.00 | 0 | Canceled | 2018-02-28 | 2 | Wednesday |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | 2018 | 5 | 20 | Online | 0 | 0 | 0 | 100.00 | 0 | Canceled | 2018-05-20 | 6 | Sunday |
| 4 | 3 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | 2019 | 7 | 13 | Online | 0 | 0 | 0 | 89.10 | 2 | Canceled | 2019-07-13 | 5 | Saturday |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56920 | 2 | 0 | 2 | 6 | Meal Plan 1 | 0 | Room_Type 1 | 148 | 2018 | 7 | 1 | Online | 0 | 0 | 0 | 98.39 | 2 | Not_Canceled | 2018-07-01 | 6 | Sunday |
| 56921 | 2 | 1 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 45 | 2019 | 6 | 15 | Online | 0 | 0 | 0 | 163.88 | 1 | Not_Canceled | 2019-06-15 | 5 | Saturday |
| 56922 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 320 | 2019 | 5 | 15 | Offline | 0 | 0 | 0 | 90.00 | 1 | Canceled | 2019-05-15 | 2 | Wednesday |
| 56923 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 63 | 2018 | 4 | 21 | Online | 0 | 0 | 0 | 94.50 | 0 | Canceled | 2018-04-21 | 5 | Saturday |
| 56924 | 2 | 0 | 2 | 2 | Not Selected | 0 | Room_Type 1 | 6 | 2019 | 4 | 28 | Online | 0 | 0 | 0 | 162.50 | 2 | Not_Canceled | 2019-04-28 | 6 | Sunday |
42541 rows × 21 columns
## Q7. Does the day of the week affect booking cancellation?
plt.figure(figsize=(10, 7))
ax = sns.countplot(x='booking_status', hue='Week', data=data, palette='winter')
plt.title('Booking status by day of the week of arrival')
plt.xlabel('Booking Status')
plt.ylabel('Count')
bar_perc(ax, data['booking_status'])
data
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | arrival_date_full | Day of the Week | Week | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | 2017 | 10 | 2 | Offline | 0 | 0 | 0 | 65.00 | 0 | Not_Canceled | 2017-10-02 | 0 | Monday |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | 2018 | 11 | 6 | Online | 0 | 0 | 0 | 106.68 | 1 | Not_Canceled | 2018-11-06 | 1 | Tuesday |
| 2 | 1 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | 2018 | 2 | 28 | Online | 0 | 0 | 0 | 60.00 | 0 | Canceled | 2018-02-28 | 2 | Wednesday |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | 2018 | 5 | 20 | Online | 0 | 0 | 0 | 100.00 | 0 | Canceled | 2018-05-20 | 6 | Sunday |
| 4 | 3 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | 2019 | 7 | 13 | Online | 0 | 0 | 0 | 89.10 | 2 | Canceled | 2019-07-13 | 5 | Saturday |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56920 | 2 | 0 | 2 | 6 | Meal Plan 1 | 0 | Room_Type 1 | 148 | 2018 | 7 | 1 | Online | 0 | 0 | 0 | 98.39 | 2 | Not_Canceled | 2018-07-01 | 6 | Sunday |
| 56921 | 2 | 1 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 45 | 2019 | 6 | 15 | Online | 0 | 0 | 0 | 163.88 | 1 | Not_Canceled | 2019-06-15 | 5 | Saturday |
| 56922 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 320 | 2019 | 5 | 15 | Offline | 0 | 0 | 0 | 90.00 | 1 | Canceled | 2019-05-15 | 2 | Wednesday |
| 56923 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 63 | 2018 | 4 | 21 | Online | 0 | 0 | 0 | 94.50 | 0 | Canceled | 2018-04-21 | 5 | Saturday |
| 56924 | 2 | 0 | 2 | 2 | Not Selected | 0 | Room_Type 1 | 6 | 2019 | 4 | 28 | Online | 0 | 0 | 0 | 162.50 | 2 | Not_Canceled | 2019-04-28 | 6 | Sunday |
42541 rows × 21 columns
data.drop(['Week'], axis=1, inplace=True)
data
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | arrival_year | arrival_month | arrival_date | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | arrival_date_full | Day of the Week | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | 2017 | 10 | 2 | Offline | 0 | 0 | 0 | 65.00 | 0 | Not_Canceled | 2017-10-02 | 0 |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | 2018 | 11 | 6 | Online | 0 | 0 | 0 | 106.68 | 1 | Not_Canceled | 2018-11-06 | 1 |
| 2 | 1 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | 2018 | 2 | 28 | Online | 0 | 0 | 0 | 60.00 | 0 | Canceled | 2018-02-28 | 2 |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | 2018 | 5 | 20 | Online | 0 | 0 | 0 | 100.00 | 0 | Canceled | 2018-05-20 | 6 |
| 4 | 3 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | 2019 | 7 | 13 | Online | 0 | 0 | 0 | 89.10 | 2 | Canceled | 2019-07-13 | 5 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56920 | 2 | 0 | 2 | 6 | Meal Plan 1 | 0 | Room_Type 1 | 148 | 2018 | 7 | 1 | Online | 0 | 0 | 0 | 98.39 | 2 | Not_Canceled | 2018-07-01 | 6 |
| 56921 | 2 | 1 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 45 | 2019 | 6 | 15 | Online | 0 | 0 | 0 | 163.88 | 1 | Not_Canceled | 2019-06-15 | 5 |
| 56922 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 320 | 2019 | 5 | 15 | Offline | 0 | 0 | 0 | 90.00 | 1 | Canceled | 2019-05-15 | 2 |
| 56923 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 63 | 2018 | 4 | 21 | Online | 0 | 0 | 0 | 94.50 | 0 | Canceled | 2018-04-21 | 5 |
| 56924 | 2 | 0 | 2 | 2 | Not Selected | 0 | Room_Type 1 | 6 | 2019 | 4 | 28 | Online | 0 | 0 | 0 | 162.50 | 2 | Not_Canceled | 2019-04-28 | 6 |
42541 rows × 20 columns
df = data.copy()  # backup copy before outlier treatment
# data = df.copy()  # uncomment to restore the original data
# Treating outliers in the remaining numeric variables
# function to treat outliers by clipping them to the whiskers
def treat_outliers(df, col):
    """
    Treats outliers in a numerical variable by clipping.
    df: dataframe
    col: str, name of the numerical column
    """
    Q1 = df[col].quantile(0.25)  # 25th percentile
    Q3 = df[col].quantile(0.75)  # 75th percentile
    IQR = Q3 - Q1
    # using 3 * IQR instead of the usual 1.5 * IQR so that only extreme outliers are clipped
    Lower_Whisker = Q1 - 3 * IQR
    Upper_Whisker = Q3 + 3 * IQR
    # values below Lower_Whisker are raised to Lower_Whisker;
    # values above Upper_Whisker are lowered to Upper_Whisker
    df[col] = np.clip(df[col], Lower_Whisker, Upper_Whisker)
    return df
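The clipping step can be sanity-checked on a toy column (hypothetical numbers, not the bookings data):

```python
import numpy as np
import pandas as pd

# Toy check of the 3*IQR clipping logic (hypothetical values)
toy = pd.DataFrame({"price": [10.0, 12.0, 11.0, 13.0, 500.0]})
q1, q3 = toy["price"].quantile(0.25), toy["price"].quantile(0.75)  # 11.0, 13.0
iqr = q3 - q1                                                       # 2.0
lower, upper = q1 - 3 * iqr, q3 + 3 * iqr                           # 5.0, 19.0
toy["price"] = np.clip(toy["price"], lower, upper)
# the extreme 500.0 is pulled down to the upper whisker, 19.0
```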
# distributions after treating outliers
cols_to_ol = ['no_of_adults',
'no_of_children',
'no_of_weekend_nights',
'no_of_week_nights',
'required_car_parking_space',
'lead_time',
'arrival_year',
'arrival_month',
'arrival_date',
'repeated_guest',
'no_of_previous_cancellations',
'no_of_previous_bookings_not_canceled',
'avg_price_per_room',
'no_of_special_requests']
for colname in cols_to_ol:
    data = treat_outliers(data, colname)  # applying the outlier-clipping function
    sns.histplot(data[colname], bins=50, kde=True)
    plt.show()
#BOXPLOT after outliers removal
# let's plot the boxplots of all columns to check for outliers
numeric_columns=['no_of_adults',
'no_of_children',
'no_of_weekend_nights',
'no_of_week_nights',
'required_car_parking_space',
'lead_time',
'arrival_year',
'arrival_month',
'arrival_date',
'repeated_guest',
'no_of_previous_cancellations',
'no_of_previous_bookings_not_canceled',
'avg_price_per_room',
'no_of_special_requests']
plt.figure(figsize=(20, 30))
for i, variable in enumerate(numeric_columns):
    plt.subplot(5, 4, i + 1)
    plt.boxplot(data[variable], whis=1.5)
    plt.tight_layout()
    plt.title(variable)
plt.show()
bk_variable = {'Not_Canceled' : 0 , 'Canceled' : 1 }
data['booking_status'] = data['booking_status'].map(bk_variable)
#data['booking_status'] = data['booking_status'].apply(lambda x: 1 if x == "Canceled" else 0)
data.booking_status.value_counts()
0    28061
1    14480
Name: booking_status, dtype: int64
data['booking_status'] = data['booking_status'].astype(int)
data.booking_status
0 0
1 0
2 1
3 1
4 1
..
56920 0
56921 0
56922 1
56923 1
56924 0
Name: booking_status, Length: 42541, dtype: int32
data.drop(["arrival_year", "arrival_month", "arrival_date"], axis=1,inplace = True)
data
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | arrival_date_full | Day of the Week | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | Offline | 0 | 0 | 0 | 65.00 | 0 | 0 | 2017-10-02 | 0 |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | Online | 0 | 0 | 0 | 106.68 | 1 | 0 | 2018-11-06 | 1 |
| 2 | 2 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | Online | 0 | 0 | 0 | 60.00 | 0 | 1 | 2018-02-28 | 2 |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | Online | 0 | 0 | 0 | 100.00 | 0 | 1 | 2018-05-20 | 6 |
| 4 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | Online | 0 | 0 | 0 | 89.10 | 2 | 1 | 2019-07-13 | 5 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56920 | 2 | 0 | 2 | 6 | Meal Plan 1 | 0 | Room_Type 1 | 148 | Online | 0 | 0 | 0 | 98.39 | 2 | 0 | 2018-07-01 | 6 |
| 56921 | 2 | 0 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 45 | Online | 0 | 0 | 0 | 163.88 | 1 | 0 | 2019-06-15 | 5 |
| 56922 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 320 | Offline | 0 | 0 | 0 | 90.00 | 1 | 1 | 2019-05-15 | 2 |
| 56923 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 63 | Online | 0 | 0 | 0 | 94.50 | 0 | 1 | 2018-04-21 | 5 |
| 56924 | 2 | 0 | 2 | 2 | Not Selected | 0 | Room_Type 1 | 6 | Online | 0 | 0 | 0 | 162.50 | 2 | 0 | 2019-04-28 | 6 |
42541 rows × 17 columns
data['arrival_date_full'] = pd.to_datetime(data['arrival_date_full'],infer_datetime_format=True)
pd.PeriodIndex(data.arrival_date_full, freq='M')
PeriodIndex(['2017-10', '2018-11', '2018-02', '2018-05', '2019-07', '2018-04',
'2019-06', '2018-09', '2017-10', '2019-04',
...
'2019-07', '2019-05', '2018-08', '2018-10', '2019-06', '2018-07',
'2019-06', '2019-05', '2018-04', '2019-04'],
dtype='period[M]', name='arrival_date_full', length=42541, freq='M')
data['yearly_MONTHS'] = pd.PeriodIndex(data.arrival_date_full, freq='M')
data[['yearly_MONTHS' , 'arrival_date_full']]
| yearly_MONTHS | arrival_date_full | |
|---|---|---|
| 0 | 2017-10 | 2017-10-02 |
| 1 | 2018-11 | 2018-11-06 |
| 2 | 2018-02 | 2018-02-28 |
| 3 | 2018-05 | 2018-05-20 |
| 4 | 2019-07 | 2019-07-13 |
| ... | ... | ... |
| 56920 | 2018-07 | 2018-07-01 |
| 56921 | 2019-06 | 2019-06-15 |
| 56922 | 2019-05 | 2019-05-15 |
| 56923 | 2018-04 | 2018-04-21 |
| 56924 | 2019-04 | 2019-04-28 |
42541 rows × 2 columns
# function to convert arrival_date_full to ordinal to be read by Logistic Regression model
# data['arrival_date_full']=data['arrival_date_full'].apply(lambda x: x.toordinal())
data.yearly_MONTHS.unique()
<PeriodArray> ['2017-10', '2018-11', '2018-02', '2018-05', '2019-07', '2018-04', '2019-06', '2018-09', '2019-04', '2018-12', '2018-07', '2018-10', '2019-05', '2018-06', '2017-08', '2019-03', '2018-03', '2019-08', '2017-09', '2017-11', '2018-01', '2017-12', '2019-01', '2018-08', '2019-02', '2017-07'] Length: 26, dtype: period[M]
data.yearly_MONTHS.value_counts()
2019-05    2503
2019-07    2439
2018-08    2416
2019-08    2253
2019-06    2223
2018-10    2219
2019-04    2192
2018-07    2112
2018-09    2096
2019-03    2052
2018-04    2035
2018-03    1992
2018-06    1850
2018-05    1845
2018-11    1737
2018-12    1715
2019-02    1573
2019-01    1341
2018-02    1281
2017-10     990
2017-09     961
2018-01     774
2017-12     670
2017-08     643
2017-11     455
2017-07     174
Freq: M, Name: yearly_MONTHS, dtype: int64
data['yearly_MONTHS']=data.yearly_MONTHS.astype('category')
data.drop(["arrival_date_full"], axis=1,inplace = True)
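As an aside, the same monthly buckets can be built with the `.dt` accessor instead of constructing a `PeriodIndex`; a minimal sketch on hypothetical dates:

```python
import pandas as pd

# Series.dt.to_period('M') is equivalent to pd.PeriodIndex(..., freq='M')
dates = pd.Series(pd.to_datetime(["2017-10-02", "2018-11-06"]))
months = dates.dt.to_period("M")
# months holds period[M] values: 2017-10 and 2018-11
```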
import pandas_profiling
pandas_profiling.ProfileReport(data)
# defining a function to compute different metrics to check performance of a classification model built using statsmodels
def model_performance_classification_statsmodels(
    model, predictors, target, threshold=0.5
):
    """
    Function to compute different metrics to check classification model performance
    model: classifier
    predictors: independent variables
    target: dependent variable
    threshold: threshold for classifying the observation as class 1
    """
    # flagging which predicted probabilities are greater than the threshold
    pred_temp = model.predict(predictors) > threshold
    # converting the boolean flags to 0/1 class labels
    pred = np.round(pred_temp)
    acc = accuracy_score(target, pred)  # Accuracy
    recall = recall_score(target, pred)  # Recall
    precision = precision_score(target, pred)  # Precision
    f1 = f1_score(target, pred)  # F1-score
    # collecting the metrics in a dataframe
    df_perf = pd.DataFrame(
        {"Accuracy": acc, "Recall": recall, "Precision": precision, "F1": f1},
        index=[0],
    )
    return df_perf
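The sklearn metrics the helper relies on can be sanity-checked on hand-made labels (toy arrays, not model output):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Toy labels with TP=2, TN=2, FP=1, FN=1
target = np.array([0, 0, 1, 1, 1, 0])
pred = np.array([0, 1, 1, 1, 0, 0])
acc = accuracy_score(target, pred)    # (TP + TN) / total = 4/6
rec = recall_score(target, pred)      # TP / (TP + FN) = 2/3
prec = precision_score(target, pred)  # TP / (TP + FP) = 2/3
f1 = f1_score(target, pred)           # harmonic mean of precision and recall = 2/3
```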
# defining a function to plot the confusion_matrix of a classification model
def confusion_matrix_statsmodels(model, predictors, target, threshold=0.5):
    """
    To plot the confusion_matrix with percentages
    model: classifier
    predictors: independent variables
    target: dependent variable
    threshold: threshold for classifying the observation as class 1
    """
    y_pred = model.predict(predictors) > threshold
    cm = confusion_matrix(target, y_pred)
    labels = np.asarray(
        [
            ["{0:0.0f}".format(item) + "\n{0:.2%}".format(item / cm.flatten().sum())]
            for item in cm.flatten()
        ]
    ).reshape(2, 2)
    plt.figure(figsize=(6, 4))
    sns.heatmap(cm, annot=labels, fmt="")
    plt.ylabel("True label")
    plt.xlabel("Predicted label")
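The count-plus-percentage annotation strings the plotting helper builds can be verified without drawing anything, on toy labels (hypothetical, not model predictions):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Toy labels giving cm = [[2, 1], [1, 4]] over 8 observations
y_true = [0, 0, 0, 1, 1, 1, 1, 1]
y_hat = [0, 0, 1, 1, 1, 1, 0, 1]
cm = confusion_matrix(y_true, y_hat)
labels = np.asarray(
    ["{0:0.0f}\n{1:.2%}".format(item, item / cm.sum()) for item in cm.flatten()]
).reshape(2, 2)
# e.g. the top-left cell reads "2" over "25.00%"
```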
data
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | type_of_meal_plan | required_car_parking_space | room_type_reserved | lead_time | market_segment_type | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | booking_status | Day of the Week | yearly_MONTHS | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 224 | Offline | 0 | 0 | 0 | 65.00 | 0 | 0 | 0 | 2017-10 |
| 1 | 2 | 0 | 2 | 3 | Not Selected | 0 | Room_Type 1 | 5 | Online | 0 | 0 | 0 | 106.68 | 1 | 0 | 1 | 2018-11 |
| 2 | 2 | 0 | 2 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 1 | Online | 0 | 0 | 0 | 60.00 | 0 | 1 | 2 | 2018-02 |
| 3 | 2 | 0 | 0 | 2 | Meal Plan 1 | 0 | Room_Type 1 | 211 | Online | 0 | 0 | 0 | 100.00 | 0 | 1 | 6 | 2018-05 |
| 4 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 277 | Online | 0 | 0 | 0 | 89.10 | 2 | 1 | 5 | 2019-07 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 56920 | 2 | 0 | 2 | 6 | Meal Plan 1 | 0 | Room_Type 1 | 148 | Online | 0 | 0 | 0 | 98.39 | 2 | 0 | 6 | 2018-07 |
| 56921 | 2 | 0 | 0 | 1 | Meal Plan 2 | 0 | Room_Type 4 | 45 | Online | 0 | 0 | 0 | 163.88 | 1 | 0 | 5 | 2019-06 |
| 56922 | 2 | 0 | 1 | 1 | Meal Plan 1 | 0 | Room_Type 1 | 320 | Offline | 0 | 0 | 0 | 90.00 | 1 | 1 | 2 | 2019-05 |
| 56923 | 2 | 0 | 0 | 3 | Not Selected | 0 | Room_Type 1 | 63 | Online | 0 | 0 | 0 | 94.50 | 0 | 1 | 5 | 2018-04 |
| 56924 | 2 | 0 | 2 | 2 | Not Selected | 0 | Room_Type 1 | 6 | Online | 0 | 0 | 0 | 162.50 | 2 | 0 | 6 | 2019-04 |
42541 rows × 17 columns
# Splitting the target from predictors
X = data.drop(["booking_status"], axis=1)
y = data["booking_status"]
X = pd.get_dummies(X, drop_first = True)
X.head()
| no_of_adults | no_of_children | no_of_weekend_nights | no_of_week_nights | required_car_parking_space | lead_time | repeated_guest | no_of_previous_cancellations | no_of_previous_bookings_not_canceled | avg_price_per_room | no_of_special_requests | Day of the Week | type_of_meal_plan_Meal Plan 2 | type_of_meal_plan_Meal Plan 3 | type_of_meal_plan_Not Selected | room_type_reserved_Room_Type 2 | room_type_reserved_Room_Type 3 | room_type_reserved_Room_Type 4 | room_type_reserved_Room_Type 5 | room_type_reserved_Room_Type 6 | room_type_reserved_Room_Type 7 | market_segment_type_Complementary | market_segment_type_Corporate | market_segment_type_Offline | market_segment_type_Online | yearly_MONTHS_2017-08 | yearly_MONTHS_2017-09 | yearly_MONTHS_2017-10 | yearly_MONTHS_2017-11 | yearly_MONTHS_2017-12 | yearly_MONTHS_2018-01 | yearly_MONTHS_2018-02 | yearly_MONTHS_2018-03 | yearly_MONTHS_2018-04 | yearly_MONTHS_2018-05 | yearly_MONTHS_2018-06 | yearly_MONTHS_2018-07 | yearly_MONTHS_2018-08 | yearly_MONTHS_2018-09 | yearly_MONTHS_2018-10 | yearly_MONTHS_2018-11 | yearly_MONTHS_2018-12 | yearly_MONTHS_2019-01 | yearly_MONTHS_2019-02 | yearly_MONTHS_2019-03 | yearly_MONTHS_2019-04 | yearly_MONTHS_2019-05 | yearly_MONTHS_2019-06 | yearly_MONTHS_2019-07 | yearly_MONTHS_2019-08 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 0 | 1 | 2 | 0 | 224 | 0 | 0 | 0 | 65.00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2 | 0 | 2 | 3 | 0 | 5 | 0 | 0 | 0 | 106.68 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 2 | 0 | 2 | 1 | 0 | 1 | 0 | 0 | 0 | 60.00 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 2 | 0 | 0 | 2 | 0 | 211 | 0 | 0 | 0 | 100.00 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 2 | 0 | 0 | 3 | 0 | 277 | 0 | 0 | 0 | 89.10 | 2 | 5 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
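What `drop_first=True` does can be seen on a toy categorical column (hypothetical levels): the alphabetically first level becomes the implicit baseline, encoded by a row of all-zero dummies.

```python
import pandas as pd

# Toy column: levels sort as 'Not Selected', 'Plan 1', 'Plan 2';
# drop_first=True drops the first, so 'Not Selected' is the baseline
toy = pd.DataFrame({"meal": ["Plan 1", "Plan 2", "Not Selected", "Plan 1"]})
dummies = pd.get_dummies(toy, drop_first=True)
# remaining columns: 'meal_Plan 1', 'meal_Plan 2'
```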
X.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 42541 entries, 0 to 56924 Data columns (total 50 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 no_of_adults 42541 non-null int64 1 no_of_children 42541 non-null int64 2 no_of_weekend_nights 42541 non-null int64 3 no_of_week_nights 42541 non-null int64 4 required_car_parking_space 42541 non-null int64 5 lead_time 42541 non-null int64 6 repeated_guest 42541 non-null int64 7 no_of_previous_cancellations 42541 non-null int64 8 no_of_previous_bookings_not_canceled 42541 non-null int64 9 avg_price_per_room 42541 non-null float64 10 no_of_special_requests 42541 non-null int64 11 Day of the Week 42541 non-null int64 12 type_of_meal_plan_Meal Plan 2 42541 non-null uint8 13 type_of_meal_plan_Meal Plan 3 42541 non-null uint8 14 type_of_meal_plan_Not Selected 42541 non-null uint8 15 room_type_reserved_Room_Type 2 42541 non-null uint8 16 room_type_reserved_Room_Type 3 42541 non-null uint8 17 room_type_reserved_Room_Type 4 42541 non-null uint8 18 room_type_reserved_Room_Type 5 42541 non-null uint8 19 room_type_reserved_Room_Type 6 42541 non-null uint8 20 room_type_reserved_Room_Type 7 42541 non-null uint8 21 market_segment_type_Complementary 42541 non-null uint8 22 market_segment_type_Corporate 42541 non-null uint8 23 market_segment_type_Offline 42541 non-null uint8 24 market_segment_type_Online 42541 non-null uint8 25 yearly_MONTHS_2017-08 42541 non-null uint8 26 yearly_MONTHS_2017-09 42541 non-null uint8 27 yearly_MONTHS_2017-10 42541 non-null uint8 28 yearly_MONTHS_2017-11 42541 non-null uint8 29 yearly_MONTHS_2017-12 42541 non-null uint8 30 yearly_MONTHS_2018-01 42541 non-null uint8 31 yearly_MONTHS_2018-02 42541 non-null uint8 32 yearly_MONTHS_2018-03 42541 non-null uint8 33 yearly_MONTHS_2018-04 42541 non-null uint8 34 yearly_MONTHS_2018-05 42541 non-null uint8 35 yearly_MONTHS_2018-06 42541 non-null uint8 36 yearly_MONTHS_2018-07 42541 non-null uint8 37 yearly_MONTHS_2018-08 42541 non-null uint8 
38 yearly_MONTHS_2018-09 42541 non-null uint8 39 yearly_MONTHS_2018-10 42541 non-null uint8 40 yearly_MONTHS_2018-11 42541 non-null uint8 41 yearly_MONTHS_2018-12 42541 non-null uint8 42 yearly_MONTHS_2019-01 42541 non-null uint8 43 yearly_MONTHS_2019-02 42541 non-null uint8 44 yearly_MONTHS_2019-03 42541 non-null uint8 45 yearly_MONTHS_2019-04 42541 non-null uint8 46 yearly_MONTHS_2019-05 42541 non-null uint8 47 yearly_MONTHS_2019-06 42541 non-null uint8 48 yearly_MONTHS_2019-07 42541 non-null uint8 49 yearly_MONTHS_2019-08 42541 non-null uint8 dtypes: float64(1), int64(11), uint8(38) memory usage: 7.0 MB
vif_series = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
    dtype=float,
)
print("Series before feature selection: \n\n{}\n".format(vif_series))
Series before feature selection: 

no_of_adults                            480.801630
no_of_children                                 NaN
no_of_weekend_nights                      1.374636
no_of_week_nights                         1.212010
required_car_parking_space                     NaN
lead_time                                 1.479795
repeated_guest                                 NaN
no_of_previous_cancellations                   NaN
no_of_previous_bookings_not_canceled           NaN
avg_price_per_room                        3.142208
no_of_special_requests                    1.097756
Day of the Week                           1.333526
type_of_meal_plan_Meal Plan 2             1.116276
type_of_meal_plan_Meal Plan 3             1.021228
type_of_meal_plan_Not Selected            1.340802
room_type_reserved_Room_Type 2            1.040140
room_type_reserved_Room_Type 3            1.001389
room_type_reserved_Room_Type 4            1.363099
room_type_reserved_Room_Type 5            1.125480
room_type_reserved_Room_Type 6            1.450405
room_type_reserved_Room_Type 7            1.138138
market_segment_type_Complementary         3.789403
market_segment_type_Corporate            10.577558
market_segment_type_Offline              26.953910
market_segment_type_Online               35.743082
yearly_MONTHS_2017-08                     4.645434
yearly_MONTHS_2017-09                     6.429023
yearly_MONTHS_2017-10                     6.579641
yearly_MONTHS_2017-11                     3.605965
yearly_MONTHS_2017-12                     4.809834
yearly_MONTHS_2018-01                     5.386442
yearly_MONTHS_2018-02                     8.170394
yearly_MONTHS_2018-03                    11.937924
yearly_MONTHS_2018-04                    12.186378
yearly_MONTHS_2018-05                    11.268103
yearly_MONTHS_2018-06                    11.267723
yearly_MONTHS_2018-07                    12.666867
yearly_MONTHS_2018-08                    14.280371
yearly_MONTHS_2018-09                    12.658169
yearly_MONTHS_2018-10                    13.236154
yearly_MONTHS_2018-11                    10.615876
yearly_MONTHS_2018-12                    10.484377
yearly_MONTHS_2019-01                     8.474761
yearly_MONTHS_2019-02                     9.728387
yearly_MONTHS_2019-03                    12.260397
yearly_MONTHS_2019-04                    13.188648
yearly_MONTHS_2019-05                    14.920306
yearly_MONTHS_2019-06                    13.405707
yearly_MONTHS_2019-07                    14.548825
yearly_MONTHS_2019-08                    13.677357
dtype: float64
The variables 'no_of_children', 'repeated_guest', 'required_car_parking_space', 'no_of_previous_cancellations', and 'no_of_previous_bookings_not_canceled' return NaN when the VIF is computed, most likely because the outlier clipping left them (near-)constant, so I'll remove them.
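The undefined/exploding VIFs can be reproduced on a toy frame (hypothetical data): when a column is perfectly predictable from the others, the auxiliary regression has R² = 1 and VIF = 1/(1 − R²) blows up.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# y = 2 * x is perfectly collinear with x, so the auxiliary regression of y
# on the other columns has R^2 = 1 and the VIF is infinite (toy data)
rng = np.random.default_rng(0)
x = rng.normal(size=100)
toy = pd.DataFrame({"x": x, "y": 2 * x, "z": rng.normal(size=100)})
vif_y = variance_inflation_factor(toy.values, 1)
# vif_y is inf, or astronomically large depending on floating point
```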
#Removing Multicollinearity
col_remove = ['no_of_children','repeated_guest',"required_car_parking_space",'no_of_previous_cancellations','no_of_previous_bookings_not_canceled' ]
X = X.drop(col_remove, axis=1,)
vif_series = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
    dtype=float,
)
print("Series after removing variables with undefined VIF: \n\n{}\n".format(vif_series))
Series after removing variables with undefined VIF: 

no_of_adults                         480.801630
no_of_weekend_nights                   1.374636
no_of_week_nights                      1.212010
lead_time                              1.479795
avg_price_per_room                     3.142208
no_of_special_requests                 1.097756
Day of the Week                        1.333526
type_of_meal_plan_Meal Plan 2          1.116276
type_of_meal_plan_Meal Plan 3          1.021228
type_of_meal_plan_Not Selected         1.340802
room_type_reserved_Room_Type 2         1.040140
room_type_reserved_Room_Type 3         1.001389
room_type_reserved_Room_Type 4         1.363099
room_type_reserved_Room_Type 5         1.125480
room_type_reserved_Room_Type 6         1.450405
room_type_reserved_Room_Type 7         1.138138
market_segment_type_Complementary      3.789403
market_segment_type_Corporate         10.577558
market_segment_type_Offline           26.953910
market_segment_type_Online            35.743082
yearly_MONTHS_2017-08                  4.645434
yearly_MONTHS_2017-09                  6.429023
yearly_MONTHS_2017-10                  6.579641
yearly_MONTHS_2017-11                  3.605965
yearly_MONTHS_2017-12                  4.809834
yearly_MONTHS_2018-01                  5.386442
yearly_MONTHS_2018-02                  8.170394
yearly_MONTHS_2018-03                 11.937924
yearly_MONTHS_2018-04                 12.186378
yearly_MONTHS_2018-05                 11.268103
yearly_MONTHS_2018-06                 11.267723
yearly_MONTHS_2018-07                 12.666867
yearly_MONTHS_2018-08                 14.280371
yearly_MONTHS_2018-09                 12.658169
yearly_MONTHS_2018-10                 13.236154
yearly_MONTHS_2018-11                 10.615876
yearly_MONTHS_2018-12                 10.484377
yearly_MONTHS_2019-01                  8.474761
yearly_MONTHS_2019-02                  9.728387
yearly_MONTHS_2019-03                 12.260397
yearly_MONTHS_2019-04                 13.188648
yearly_MONTHS_2019-05                 14.920306
yearly_MONTHS_2019-06                 13.405707
yearly_MONTHS_2019-07                 14.548825
yearly_MONTHS_2019-08                 13.677357
dtype: float64
#Removing Multicollinearity
col_remove = ['no_of_adults','market_segment_type_Offline' ]
X = X.drop(col_remove, axis=1,)
vif_series = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
    dtype=float,
)
print("Series after removing 'no_of_adults' and 'market_segment_type_Offline': \n\n{}\n".format(vif_series))
Series after removing 'no_of_adults' and 'market_segment_type_Offline': 

no_of_weekend_nights                  2.733855
no_of_week_nights                     4.224263
lead_time                             2.916432
avg_price_per_room                   26.353510
no_of_special_requests                2.024628
Day of the Week                       4.273069
type_of_meal_plan_Meal Plan 2         1.170414
type_of_meal_plan_Meal Plan 3         1.021418
type_of_meal_plan_Not Selected        1.683376
room_type_reserved_Room_Type 2        1.057852
room_type_reserved_Room_Type 3        1.001528
room_type_reserved_Room_Type 4        1.732824
room_type_reserved_Room_Type 5        1.147140
room_type_reserved_Room_Type 6        1.489206
room_type_reserved_Room_Type 7        1.142764
market_segment_type_Complementary     1.359042
market_segment_type_Corporate         1.391719
market_segment_type_Online            8.782936
yearly_MONTHS_2017-08                 1.329295
yearly_MONTHS_2017-09                 1.565355
yearly_MONTHS_2017-10                 1.492861
yearly_MONTHS_2017-11                 1.176909
yearly_MONTHS_2017-12                 1.303241
yearly_MONTHS_2018-01                 1.313814
yearly_MONTHS_2018-02                 1.541629
yearly_MONTHS_2018-03                 2.039171
yearly_MONTHS_2018-04                 2.204071
yearly_MONTHS_2018-05                 2.270642
yearly_MONTHS_2018-06                 2.242922
yearly_MONTHS_2018-07                 2.574963
yearly_MONTHS_2018-08                 2.923821
yearly_MONTHS_2018-09                 2.682193
yearly_MONTHS_2018-10                 2.585414
yearly_MONTHS_2018-11                 2.050693
yearly_MONTHS_2018-12                 2.047680
yearly_MONTHS_2019-01                 1.736283
yearly_MONTHS_2019-02                 1.869963
yearly_MONTHS_2019-03                 2.167479
yearly_MONTHS_2019-04                 2.873150
yearly_MONTHS_2019-05                 3.343459
yearly_MONTHS_2019-06                 3.022961
yearly_MONTHS_2019-07                 3.261569
yearly_MONTHS_2019-08                 3.338658
dtype: float64
X_train, X_test,y_train, y_test = train_test_split(X, y,test_size=0.4, random_state=1)
print("Number of rows in train data = ", X_train.shape[0])
print("Number of rows in test data = ", X_test.shape[0])
Number of rows in train data =  25524
Number of rows in test data =  17017
print("Percentage of classes in training set: ")
print(y_train.value_counts(normalize= True))
print("Percentage of classes in test set: ")
print(y_test.value_counts(normalize = True))
Percentage of classes in training set: 
0    0.65981
1    0.34019
Name: booking_status, dtype: float64
Percentage of classes in test set: 
0    0.659341
1    0.340659
Name: booking_status, dtype: float64
print("Original Canceled True Values : {0} ({1:0.2f}%)".format(len(data.loc[data['booking_status'] == 1]), (len(data.loc[data['booking_status'] == 1])/len(data.index)) * 100))
print("Original Canceled False Values : {0} ({1:0.2f}%)".format(len(data.loc[data['booking_status'] == 0]), (len(data.loc[data['booking_status'] == 0])/len(data.index)) * 100))
print("")
print("Training Canceled True Values : {0} ({1:0.2f}%)".format(len(y_train[y_train[:] == 1]), (len(y_train[y_train[:] == 1])/len(y_train)) * 100))
print("Training Canceled False Values : {0} ({1:0.2f}%)".format(len(y_train[y_train[:] == 0]), (len(y_train[y_train[:] == 0])/len(y_train)) * 100))
print("")
print("Test Canceled True Values : {0} ({1:0.2f}%)".format(len(y_test[y_test[:] == 1]), (len(y_test[y_test[:] == 1])/len(y_test)) * 100))
print("Test Canceled False Values : {0} ({1:0.2f}%)".format(len(y_test[y_test[:] == 0]), (len(y_test[y_test[:] == 0])/len(y_test)) * 100))
print("")
Original Canceled True Values : 14480 (34.04%)
Original Canceled False Values : 28061 (65.96%)

Training Canceled True Values : 8683 (34.02%)
Training Canceled False Values : 16841 (65.98%)

Test Canceled True Values : 5797 (34.07%)
Test Canceled False Values : 11220 (65.93%)
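The near-identical class proportions above hold only approximately, by chance; passing `stratify=y` to `train_test_split` enforces them exactly. A toy sketch (hypothetical labels, not the real target):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# 100 toy labels with a 70/30 split; stratify keeps the ratio in both halves
y_toy = np.array([0] * 70 + [1] * 30)
X_toy = np.arange(100).reshape(-1, 1)
_, _, y_tr, y_te = train_test_split(
    X_toy, y_toy, test_size=0.4, random_state=1, stratify=y_toy
)
# both y_tr (60 rows) and y_te (40 rows) are exactly 30% class 1
```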
X.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 42541 entries, 0 to 56924 Data columns (total 43 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 no_of_weekend_nights 42541 non-null int64 1 no_of_week_nights 42541 non-null int64 2 lead_time 42541 non-null int64 3 avg_price_per_room 42541 non-null float64 4 no_of_special_requests 42541 non-null int64 5 Day of the Week 42541 non-null int64 6 type_of_meal_plan_Meal Plan 2 42541 non-null uint8 7 type_of_meal_plan_Meal Plan 3 42541 non-null uint8 8 type_of_meal_plan_Not Selected 42541 non-null uint8 9 room_type_reserved_Room_Type 2 42541 non-null uint8 10 room_type_reserved_Room_Type 3 42541 non-null uint8 11 room_type_reserved_Room_Type 4 42541 non-null uint8 12 room_type_reserved_Room_Type 5 42541 non-null uint8 13 room_type_reserved_Room_Type 6 42541 non-null uint8 14 room_type_reserved_Room_Type 7 42541 non-null uint8 15 market_segment_type_Complementary 42541 non-null uint8 16 market_segment_type_Corporate 42541 non-null uint8 17 market_segment_type_Online 42541 non-null uint8 18 yearly_MONTHS_2017-08 42541 non-null uint8 19 yearly_MONTHS_2017-09 42541 non-null uint8 20 yearly_MONTHS_2017-10 42541 non-null uint8 21 yearly_MONTHS_2017-11 42541 non-null uint8 22 yearly_MONTHS_2017-12 42541 non-null uint8 23 yearly_MONTHS_2018-01 42541 non-null uint8 24 yearly_MONTHS_2018-02 42541 non-null uint8 25 yearly_MONTHS_2018-03 42541 non-null uint8 26 yearly_MONTHS_2018-04 42541 non-null uint8 27 yearly_MONTHS_2018-05 42541 non-null uint8 28 yearly_MONTHS_2018-06 42541 non-null uint8 29 yearly_MONTHS_2018-07 42541 non-null uint8 30 yearly_MONTHS_2018-08 42541 non-null uint8 31 yearly_MONTHS_2018-09 42541 non-null uint8 32 yearly_MONTHS_2018-10 42541 non-null uint8 33 yearly_MONTHS_2018-11 42541 non-null uint8 34 yearly_MONTHS_2018-12 42541 non-null uint8 35 yearly_MONTHS_2019-01 42541 non-null uint8 36 yearly_MONTHS_2019-02 42541 non-null uint8 37 yearly_MONTHS_2019-03 42541 non-null uint8 38 
yearly_MONTHS_2019-04 42541 non-null uint8 39 yearly_MONTHS_2019-05 42541 non-null uint8 40 yearly_MONTHS_2019-06 42541 non-null uint8 41 yearly_MONTHS_2019-07 42541 non-null uint8 42 yearly_MONTHS_2019-08 42541 non-null uint8 dtypes: float64(1), int64(5), uint8(37) memory usage: 5.0 MB
# Sklearn's LogisticRegression offers several solvers;
# newton-cg is a second-order method that tends to converge reliably on data like this
model = LogisticRegression(solver="newton-cg", random_state=1)
lg = model.fit(X_train, y_train)
# predicting on training set
y_pred_train = lg.predict(X_train)
print("Training set performance:")
print("Accuracy:", accuracy_score(y_train, y_pred_train))
print("Precision:", precision_score(y_train, y_pred_train))
print("Recall:", recall_score(y_train, y_pred_train))
print("F1:", f1_score(y_train, y_pred_train))
Training set performance:
Accuracy: 0.7937235543018336
Precision: 0.7287817938420348
Recall: 0.6269722446159162
F1: 0.6740543552281311
# predicting on the test set
y_pred_test = lg.predict(X_test)
print("Test set performance:")
print("Accuracy:", accuracy_score(y_test, y_pred_test))
print("Precision:", precision_score(y_test, y_pred_test))
print("Recall:", recall_score(y_test, y_pred_test))
print("F1:", f1_score(y_test, y_pred_test))
Test set performance:
Accuracy: 0.7966739143209731
Precision: 0.7344972907886815
Recall: 0.6313610488183543
F1: 0.6790352504638218
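The recall above uses the default 0.5 decision threshold; lowering the threshold flags more bookings as likely cancellations, which can only raise recall (usually at the cost of precision). A sketch on toy data, not the bookings set:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

# Noisy toy classification problem (hypothetical data)
rng = np.random.default_rng(1)
X_toy = rng.normal(size=(500, 3))
y_toy = (X_toy[:, 0] + rng.normal(scale=2.0, size=500) > 0).astype(int)
proba = LogisticRegression().fit(X_toy, y_toy).predict_proba(X_toy)[:, 1]
# a lower threshold never predicts fewer positives, so recall is monotone
rec_at_50 = recall_score(y_toy, (proba > 0.5).astype(int))
rec_at_30 = recall_score(y_toy, (proba > 0.3).astype(int))
```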
The model can make two kinds of wrong predictions:
1. Predicting a customer will not cancel the booking when in reality the customer cancels (a false negative).
2. Predicting a customer will cancel the booking when in reality the customer does not (a false positive).
Which case is more important? A false negative is costlier for the hotel: a booking predicted to hold that is actually canceled leaves the room unsold, so recall on the canceled class is the metric to prioritize.
# Splitting the target from predictors
X1 = data.drop(["booking_status"], axis=1)
Y1 = data["booking_status"].astype("int64")
X1.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 42541 entries, 0 to 56924 Data columns (total 16 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 no_of_adults 42541 non-null int64 1 no_of_children 42541 non-null int64 2 no_of_weekend_nights 42541 non-null int64 3 no_of_week_nights 42541 non-null int64 4 type_of_meal_plan 42541 non-null category 5 required_car_parking_space 42541 non-null int64 6 room_type_reserved 42541 non-null category 7 lead_time 42541 non-null int64 8 market_segment_type 42541 non-null category 9 repeated_guest 42541 non-null int64 10 no_of_previous_cancellations 42541 non-null int64 11 no_of_previous_bookings_not_canceled 42541 non-null int64 12 avg_price_per_room 42541 non-null float64 13 no_of_special_requests 42541 non-null int64 14 Day of the Week 42541 non-null int64 15 yearly_MONTHS 42541 non-null category dtypes: category(4), float64(1), int64(11) memory usage: 5.6 MB
X1 = X1.drop(['no_of_adults','no_of_children','repeated_guest',"required_car_parking_space",'no_of_previous_cancellations','no_of_previous_bookings_not_canceled'], axis=1)
X1 = pd.get_dummies(X1, drop_first = True)
X1.head()
| no_of_weekend_nights | no_of_week_nights | lead_time | avg_price_per_room | no_of_special_requests | Day of the Week | type_of_meal_plan_Meal Plan 2 | type_of_meal_plan_Meal Plan 3 | type_of_meal_plan_Not Selected | room_type_reserved_Room_Type 2 | room_type_reserved_Room_Type 3 | room_type_reserved_Room_Type 4 | room_type_reserved_Room_Type 5 | room_type_reserved_Room_Type 6 | room_type_reserved_Room_Type 7 | market_segment_type_Complementary | market_segment_type_Corporate | market_segment_type_Offline | market_segment_type_Online | yearly_MONTHS_2017-08 | yearly_MONTHS_2017-09 | yearly_MONTHS_2017-10 | yearly_MONTHS_2017-11 | yearly_MONTHS_2017-12 | yearly_MONTHS_2018-01 | yearly_MONTHS_2018-02 | yearly_MONTHS_2018-03 | yearly_MONTHS_2018-04 | yearly_MONTHS_2018-05 | yearly_MONTHS_2018-06 | yearly_MONTHS_2018-07 | yearly_MONTHS_2018-08 | yearly_MONTHS_2018-09 | yearly_MONTHS_2018-10 | yearly_MONTHS_2018-11 | yearly_MONTHS_2018-12 | yearly_MONTHS_2019-01 | yearly_MONTHS_2019-02 | yearly_MONTHS_2019-03 | yearly_MONTHS_2019-04 | yearly_MONTHS_2019-05 | yearly_MONTHS_2019-06 | yearly_MONTHS_2019-07 | yearly_MONTHS_2019-08 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 2 | 224 | 65.00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2 | 3 | 5 | 106.68 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 2 | 1 | 1 | 60.00 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 0 | 2 | 211 | 100.00 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 0 | 3 | 277 | 89.10 | 2 | 5 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
X1.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 42541 entries, 0 to 56924
Data columns (total 44 columns):
 #   Column                             Non-Null Count  Dtype
---  ------                             --------------  -----
 0   no_of_weekend_nights               42541 non-null  int64
 1   no_of_week_nights                  42541 non-null  int64
 2   lead_time                          42541 non-null  int64
 3   avg_price_per_room                 42541 non-null  float64
 4   no_of_special_requests             42541 non-null  int64
 5   Day of the Week                    42541 non-null  int64
 6   type_of_meal_plan_Meal Plan 2      42541 non-null  uint8
 7   type_of_meal_plan_Meal Plan 3      42541 non-null  uint8
 8   type_of_meal_plan_Not Selected     42541 non-null  uint8
 9   room_type_reserved_Room_Type 2     42541 non-null  uint8
 10  room_type_reserved_Room_Type 3     42541 non-null  uint8
 11  room_type_reserved_Room_Type 4     42541 non-null  uint8
 12  room_type_reserved_Room_Type 5     42541 non-null  uint8
 13  room_type_reserved_Room_Type 6     42541 non-null  uint8
 14  room_type_reserved_Room_Type 7     42541 non-null  uint8
 15  market_segment_type_Complementary  42541 non-null  uint8
 16  market_segment_type_Corporate      42541 non-null  uint8
 17  market_segment_type_Offline        42541 non-null  uint8
 18  market_segment_type_Online         42541 non-null  uint8
 19  yearly_MONTHS_2017-08              42541 non-null  uint8
 20  yearly_MONTHS_2017-09              42541 non-null  uint8
 21  yearly_MONTHS_2017-10              42541 non-null  uint8
 22  yearly_MONTHS_2017-11              42541 non-null  uint8
 23  yearly_MONTHS_2017-12              42541 non-null  uint8
 24  yearly_MONTHS_2018-01              42541 non-null  uint8
 25  yearly_MONTHS_2018-02              42541 non-null  uint8
 26  yearly_MONTHS_2018-03              42541 non-null  uint8
 27  yearly_MONTHS_2018-04              42541 non-null  uint8
 28  yearly_MONTHS_2018-05              42541 non-null  uint8
 29  yearly_MONTHS_2018-06              42541 non-null  uint8
 30  yearly_MONTHS_2018-07              42541 non-null  uint8
 31  yearly_MONTHS_2018-08              42541 non-null  uint8
 32  yearly_MONTHS_2018-09              42541 non-null  uint8
 33  yearly_MONTHS_2018-10              42541 non-null  uint8
 34  yearly_MONTHS_2018-11              42541 non-null  uint8
 35  yearly_MONTHS_2018-12              42541 non-null  uint8
 36  yearly_MONTHS_2019-01              42541 non-null  uint8
 37  yearly_MONTHS_2019-02              42541 non-null  uint8
 38  yearly_MONTHS_2019-03              42541 non-null  uint8
 39  yearly_MONTHS_2019-04              42541 non-null  uint8
 40  yearly_MONTHS_2019-05              42541 non-null  uint8
 41  yearly_MONTHS_2019-06              42541 non-null  uint8
 42  yearly_MONTHS_2019-07              42541 non-null  uint8
 43  yearly_MONTHS_2019-08              42541 non-null  uint8
dtypes: float64(1), int64(5), uint8(38)
memory usage: 5.1 MB
from sklearn import metrics
from sklearn.linear_model import LogisticRegression
# creating dummy variables
#X = pd.get_dummies(X, drop_first=True)
# adding constant
X1 = sm.add_constant(X1)
# splitting in training and test set
X_train1, X_test1, y_train1, y_test1 = train_test_split(X1, Y1, test_size=0.3, random_state=1)
#X_train1.info()
#X_train1.columns
X_train1=X_train1.astype(float)
#X_train1.info()
logit = sm.Logit(y_train1, X_train1) #.astype(float)
lg = logit.fit(
disp=False
) # setting disp=False will remove the information on number of iterations
print(lg.summary())
Logit Regression Results
==============================================================================
Dep. Variable: booking_status No. Observations: 29778
Model: Logit Df Residuals: 29733
Method: MLE Df Model: 44
Date: Sat, 18 Sep 2021 Pseudo R-squ.: 0.3379
Time: 02:00:10 Log-Likelihood: -12650.
converged: False LL-Null: -19107.
Covariance Type: nonrobust LLR p-value: 0.000
=====================================================================================================
coef std err z P>|z| [0.025 0.975]
-----------------------------------------------------------------------------------------------------
const -1.6746 0.357 -4.687 0.000 -2.375 -0.974
no_of_weekend_nights 0.0586 0.020 2.868 0.004 0.019 0.099
no_of_week_nights 0.0978 0.012 8.376 0.000 0.075 0.121
lead_time 0.0170 0.000 58.442 0.000 0.016 0.018
avg_price_per_room 0.0187 0.001 25.144 0.000 0.017 0.020
no_of_special_requests -1.3596 0.024 -56.579 0.000 -1.407 -1.312
Day of the Week -0.0028 0.009 -0.317 0.751 -0.020 0.014
type_of_meal_plan_Meal Plan 2 -0.1986 0.081 -2.454 0.014 -0.357 -0.040
type_of_meal_plan_Meal Plan 3 41.1629 2.04e+11 2.01e-10 1.000 -4.01e+11 4.01e+11
type_of_meal_plan_Not Selected 0.3402 0.044 7.800 0.000 0.255 0.426
room_type_reserved_Room_Type 2 -0.2263 0.130 -1.747 0.081 -0.480 0.028
room_type_reserved_Room_Type 3 -12.1712 638.308 -0.019 0.985 -1263.232 1238.889
room_type_reserved_Room_Type 4 -0.1783 0.044 -4.075 0.000 -0.264 -0.093
room_type_reserved_Room_Type 5 -0.3891 0.113 -3.431 0.001 -0.611 -0.167
room_type_reserved_Room_Type 6 -0.4921 0.101 -4.871 0.000 -0.690 -0.294
room_type_reserved_Room_Type 7 -0.9170 0.203 -4.511 0.000 -1.315 -0.519
market_segment_type_Complementary -52.5583 2.04e+11 -2.57e-10 1.000 -4.01e+11 4.01e+11
market_segment_type_Corporate -0.2216 0.274 -0.808 0.419 -0.760 0.316
market_segment_type_Offline -1.5947 0.262 -6.077 0.000 -2.109 -1.080
market_segment_type_Online 0.6154 0.256 2.401 0.016 0.113 1.118
yearly_MONTHS_2017-08 -1.8434 0.279 -6.603 0.000 -2.391 -1.296
yearly_MONTHS_2017-09 -2.7027 0.278 -9.734 0.000 -3.247 -2.158
yearly_MONTHS_2017-10 -3.1482 0.291 -10.807 0.000 -3.719 -2.577
yearly_MONTHS_2017-11 -2.7704 0.363 -7.641 0.000 -3.481 -2.060
yearly_MONTHS_2017-12 -3.7177 0.357 -10.411 0.000 -4.418 -3.018
yearly_MONTHS_2018-01 -4.1887 0.404 -10.361 0.000 -4.981 -3.396
yearly_MONTHS_2018-02 -1.5728 0.259 -6.067 0.000 -2.081 -1.065
yearly_MONTHS_2018-03 -1.7816 0.252 -7.056 0.000 -2.276 -1.287
yearly_MONTHS_2018-04 -2.1409 0.252 -8.499 0.000 -2.635 -1.647
yearly_MONTHS_2018-05 -2.4408 0.255 -9.587 0.000 -2.940 -1.942
yearly_MONTHS_2018-06 -2.2601 0.255 -8.877 0.000 -2.759 -1.761
yearly_MONTHS_2018-07 -2.4791 0.253 -9.792 0.000 -2.975 -1.983
yearly_MONTHS_2018-08 -2.3535 0.253 -9.315 0.000 -2.849 -1.858
yearly_MONTHS_2018-09 -2.1046 0.255 -8.257 0.000 -2.604 -1.605
yearly_MONTHS_2018-10 -2.0724 0.254 -8.162 0.000 -2.570 -1.575
yearly_MONTHS_2018-11 -1.7315 0.255 -6.779 0.000 -2.232 -1.231
yearly_MONTHS_2018-12 -3.1215 0.262 -11.914 0.000 -3.635 -2.608
yearly_MONTHS_2019-01 -2.5966 0.264 -9.851 0.000 -3.113 -2.080
yearly_MONTHS_2019-02 -1.5621 0.255 -6.131 0.000 -2.061 -1.063
yearly_MONTHS_2019-03 -1.9732 0.252 -7.817 0.000 -2.468 -1.478
yearly_MONTHS_2019-04 -2.2016 0.254 -8.675 0.000 -2.699 -1.704
yearly_MONTHS_2019-05 -2.4398 0.254 -9.587 0.000 -2.939 -1.941
yearly_MONTHS_2019-06 -2.5702 0.255 -10.067 0.000 -3.071 -2.070
yearly_MONTHS_2019-07 -2.2545 0.254 -8.862 0.000 -2.753 -1.756
yearly_MONTHS_2019-08 -2.2573 0.257 -8.792 0.000 -2.761 -1.754
=====================================================================================================
X_train2 = X_train1.drop(["market_segment_type_Complementary", "room_type_reserved_Room_Type 3","Day of the Week"], axis=1,)
logit1 = sm.Logit(y_train1, X_train2.astype(float))
lg1 = logit1.fit(disp=False)
print(lg1.summary())
Logit Regression Results
==============================================================================
Dep. Variable: booking_status No. Observations: 29778
Model: Logit Df Residuals: 29736
Method: MLE Df Model: 41
Date: Sat, 18 Sep 2021 Pseudo R-squ.: 0.3376
Time: 02:00:13 Log-Likelihood: -12657.
converged: True LL-Null: -19107.
Covariance Type: nonrobust LLR p-value: 0.000
==================================================================================================
coef std err z P>|z| [0.025 0.975]
--------------------------------------------------------------------------------------------------
const -2.0522 0.347 -5.912 0.000 -2.732 -1.372
no_of_weekend_nights 0.0626 0.018 3.465 0.001 0.027 0.098
no_of_week_nights 0.0977 0.011 8.632 0.000 0.076 0.120
lead_time 0.0170 0.000 58.573 0.000 0.016 0.018
avg_price_per_room 0.0188 0.001 25.522 0.000 0.017 0.020
no_of_special_requests -1.3606 0.024 -56.639 0.000 -1.408 -1.314
type_of_meal_plan_Meal Plan 2 -0.2029 0.081 -2.509 0.012 -0.361 -0.044
type_of_meal_plan_Meal Plan 3 2.7338 1.675 1.632 0.103 -0.549 6.017
type_of_meal_plan_Not Selected 0.3428 0.044 7.863 0.000 0.257 0.428
room_type_reserved_Room_Type 2 -0.2244 0.129 -1.733 0.083 -0.478 0.029
room_type_reserved_Room_Type 4 -0.1797 0.044 -4.106 0.000 -0.265 -0.094
room_type_reserved_Room_Type 5 -0.3980 0.113 -3.513 0.000 -0.620 -0.176
room_type_reserved_Room_Type 6 -0.5039 0.101 -4.997 0.000 -0.702 -0.306
room_type_reserved_Room_Type 7 -0.9425 0.203 -4.653 0.000 -1.340 -0.546
market_segment_type_Corporate 0.1185 0.270 0.439 0.661 -0.411 0.648
market_segment_type_Offline -1.2561 0.258 -4.867 0.000 -1.762 -0.750
market_segment_type_Online 0.9503 0.253 3.761 0.000 0.455 1.445
yearly_MONTHS_2017-08 -1.8353 0.277 -6.623 0.000 -2.378 -1.292
yearly_MONTHS_2017-09 -2.6876 0.276 -9.750 0.000 -3.228 -2.147
yearly_MONTHS_2017-10 -3.1344 0.289 -10.831 0.000 -3.702 -2.567
yearly_MONTHS_2017-11 -2.7566 0.361 -7.635 0.000 -3.464 -2.049
yearly_MONTHS_2017-12 -3.7087 0.356 -10.431 0.000 -4.406 -3.012
yearly_MONTHS_2018-01 -4.1739 0.403 -10.358 0.000 -4.964 -3.384
yearly_MONTHS_2018-02 -1.5549 0.257 -6.048 0.000 -2.059 -1.051
yearly_MONTHS_2018-03 -1.7694 0.250 -7.067 0.000 -2.260 -1.279
yearly_MONTHS_2018-04 -2.1242 0.250 -8.504 0.000 -2.614 -1.635
yearly_MONTHS_2018-05 -2.4276 0.252 -9.616 0.000 -2.922 -1.933
yearly_MONTHS_2018-06 -2.2473 0.253 -8.899 0.000 -2.742 -1.752
yearly_MONTHS_2018-07 -2.4665 0.251 -9.826 0.000 -2.959 -1.975
yearly_MONTHS_2018-08 -2.3415 0.250 -9.347 0.000 -2.832 -1.850
yearly_MONTHS_2018-09 -2.0939 0.253 -8.284 0.000 -2.589 -1.599
yearly_MONTHS_2018-10 -2.0595 0.252 -8.180 0.000 -2.553 -1.566
yearly_MONTHS_2018-11 -1.7175 0.253 -6.780 0.000 -2.214 -1.221
yearly_MONTHS_2018-12 -3.1146 0.260 -11.980 0.000 -3.624 -2.605
yearly_MONTHS_2019-01 -2.5800 0.262 -9.863 0.000 -3.093 -2.067
yearly_MONTHS_2019-02 -1.5471 0.253 -6.122 0.000 -2.042 -1.052
yearly_MONTHS_2019-03 -1.9582 0.250 -7.822 0.000 -2.449 -1.468
yearly_MONTHS_2019-04 -2.1897 0.252 -8.701 0.000 -2.683 -1.696
yearly_MONTHS_2019-05 -2.4281 0.252 -9.622 0.000 -2.923 -1.934
yearly_MONTHS_2019-06 -2.5591 0.253 -10.107 0.000 -3.055 -2.063
yearly_MONTHS_2019-07 -2.2431 0.252 -8.893 0.000 -2.737 -1.749
yearly_MONTHS_2019-08 -2.2482 0.255 -8.830 0.000 -2.747 -1.749
==================================================================================================
Observations
"market_segment_type_Complementary", "room_type_reserved_Room_Type 3","Day of the Week,'market_segment_type_Corporate','type_of_meal_plan_Meal Plan 3', and many categorical levels of purpose have p-value > 0.05. So, they are not significant and we'll drop them.
But sometimes p-values change after dropping a variable. So, we'll not drop all variables at once.
Instead, we will do the following repeatedly using a loop:
X_train3 = X_train2.drop(['type_of_meal_plan_Meal Plan 3','market_segment_type_Corporate'], axis=1,)
from statsmodels.stats.outliers_influence import variance_inflation_factor

# computing the Variance Inflation Factor (VIF) for each feature to check multicollinearity
vif_series = pd.Series(
    [variance_inflation_factor(X_train3.values, i) for i in range(X_train3.shape[1])],
    index=X_train3.columns,
    dtype=float,
)
print("Series before feature selection: \n\n{}\n".format(vif_series))
Series before feature selection:

const                             287.841921
no_of_weekend_nights                1.073358
no_of_week_nights                   1.134097
lead_time                           1.462393
avg_price_per_room                  2.719686
no_of_special_requests              1.088266
type_of_meal_plan_Meal Plan 2       1.103216
type_of_meal_plan_Not Selected      1.331395
room_type_reserved_Room_Type 2      1.042699
room_type_reserved_Room_Type 4      1.333539
room_type_reserved_Room_Type 5      1.108856
room_type_reserved_Room_Type 6      1.383516
room_type_reserved_Room_Type 7      1.071714
market_segment_type_Offline         3.145247
market_segment_type_Online          3.911511
yearly_MONTHS_2017-08               4.989669
yearly_MONTHS_2017-09               6.851361
yearly_MONTHS_2017-10               6.853289
yearly_MONTHS_2017-11               3.818060
yearly_MONTHS_2017-12               5.151988
yearly_MONTHS_2018-01               5.771931
yearly_MONTHS_2018-02               8.549540
yearly_MONTHS_2018-03              12.799578
yearly_MONTHS_2018-04              13.319354
yearly_MONTHS_2018-05              12.268496
yearly_MONTHS_2018-06              12.101216
yearly_MONTHS_2018-07              13.772674
yearly_MONTHS_2018-08              15.456017
yearly_MONTHS_2018-09              13.698168
yearly_MONTHS_2018-10              14.170494
yearly_MONTHS_2018-11              11.461575
yearly_MONTHS_2018-12              11.302667
yearly_MONTHS_2019-01               9.147003
yearly_MONTHS_2019-02              10.388643
yearly_MONTHS_2019-03              13.150295
yearly_MONTHS_2019-04              14.206605
yearly_MONTHS_2019-05              16.204065
yearly_MONTHS_2019-06              14.405782
yearly_MONTHS_2019-07              16.121126
yearly_MONTHS_2019-08              14.571103
dtype: float64
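As a side note on what the VIF measures: the VIF of a feature is 1 / (1 - R²), where R² comes from regressing that feature on all the others. The following is a minimal, numpy-only sketch on made-up data (the `vif_manual` helper and toy columns are illustrative, not part of the notebook):

```python
import numpy as np

def vif_manual(X, i):
    """VIF of column i = 1 / (1 - R^2) from regressing it on the other columns."""
    y = X[:, i]
    Z = np.delete(X, i, axis=1)
    Z = np.column_stack([np.ones(len(Z)), Z])  # add an intercept column
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(0)
a = rng.normal(size=200)
b = a + rng.normal(scale=0.1, size=200)  # nearly a copy of a -> high VIF
c = rng.normal(size=200)                 # independent -> VIF near 1
X_toy = np.column_stack([a, b, c])
```

Here `vif_manual(X_toy, 0)` is large because column `b` almost duplicates `a`, while `vif_manual(X_toy, 2)` stays near 1, which mirrors why the correlated monthly dummies above show elevated VIFs.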
logit2 = sm.Logit(y_train1, X_train3.astype(float))
lg2 = logit2.fit(disp=False)
print(lg2.summary())
Logit Regression Results
==============================================================================
Dep. Variable: booking_status No. Observations: 29778
Model: Logit Df Residuals: 29738
Method: MLE Df Model: 39
Date: Sat, 18 Sep 2021 Pseudo R-squ.: 0.3375
Time: 02:00:19 Log-Likelihood: -12658.
converged: True LL-Null: -19107.
Covariance Type: nonrobust LLR p-value: 0.000
==================================================================================================
coef std err z P>|z| [0.025 0.975]
--------------------------------------------------------------------------------------------------
const -1.9526 0.263 -7.434 0.000 -2.467 -1.438
no_of_weekend_nights 0.0623 0.018 3.449 0.001 0.027 0.098
no_of_week_nights 0.0975 0.011 8.619 0.000 0.075 0.120
lead_time 0.0170 0.000 58.649 0.000 0.016 0.018
avg_price_per_room 0.0189 0.001 25.580 0.000 0.017 0.020
no_of_special_requests -1.3607 0.024 -56.643 0.000 -1.408 -1.314
type_of_meal_plan_Meal Plan 2 -0.2042 0.081 -2.527 0.012 -0.363 -0.046
type_of_meal_plan_Not Selected 0.3432 0.044 7.874 0.000 0.258 0.429
room_type_reserved_Room_Type 2 -0.2246 0.129 -1.735 0.083 -0.478 0.029
room_type_reserved_Room_Type 4 -0.1808 0.044 -4.135 0.000 -0.266 -0.095
room_type_reserved_Room_Type 5 -0.3984 0.113 -3.516 0.000 -0.620 -0.176
room_type_reserved_Room_Type 6 -0.5064 0.101 -5.023 0.000 -0.704 -0.309
room_type_reserved_Room_Type 7 -0.9387 0.203 -4.626 0.000 -1.336 -0.541
market_segment_type_Offline -1.3566 0.115 -11.795 0.000 -1.582 -1.131
market_segment_type_Online 0.8475 0.102 8.302 0.000 0.647 1.048
yearly_MONTHS_2017-08 -1.8344 0.277 -6.623 0.000 -2.377 -1.292
yearly_MONTHS_2017-09 -2.6762 0.275 -9.721 0.000 -3.216 -2.137
yearly_MONTHS_2017-10 -3.1336 0.289 -10.833 0.000 -3.701 -2.567
yearly_MONTHS_2017-11 -2.7543 0.361 -7.631 0.000 -3.462 -2.047
yearly_MONTHS_2017-12 -3.7077 0.355 -10.431 0.000 -4.404 -3.011
yearly_MONTHS_2018-01 -4.1715 0.403 -10.355 0.000 -4.961 -3.382
yearly_MONTHS_2018-02 -1.5532 0.257 -6.045 0.000 -2.057 -1.050
yearly_MONTHS_2018-03 -1.7679 0.250 -7.065 0.000 -2.258 -1.277
yearly_MONTHS_2018-04 -2.1245 0.250 -8.509 0.000 -2.614 -1.635
yearly_MONTHS_2018-05 -2.4278 0.252 -9.621 0.000 -2.922 -1.933
yearly_MONTHS_2018-06 -2.2475 0.252 -8.904 0.000 -2.742 -1.753
yearly_MONTHS_2018-07 -2.4666 0.251 -9.831 0.000 -2.958 -1.975
yearly_MONTHS_2018-08 -2.3419 0.250 -9.353 0.000 -2.833 -1.851
yearly_MONTHS_2018-09 -2.0947 0.253 -8.291 0.000 -2.590 -1.600
yearly_MONTHS_2018-10 -2.0602 0.252 -8.187 0.000 -2.553 -1.567
yearly_MONTHS_2018-11 -1.7170 0.253 -6.781 0.000 -2.213 -1.221
yearly_MONTHS_2018-12 -3.1144 0.260 -11.984 0.000 -3.624 -2.605
yearly_MONTHS_2019-01 -2.5793 0.261 -9.865 0.000 -3.092 -2.067
yearly_MONTHS_2019-02 -1.5461 0.253 -6.121 0.000 -2.041 -1.051
yearly_MONTHS_2019-03 -1.9576 0.250 -7.824 0.000 -2.448 -1.467
yearly_MONTHS_2019-04 -2.1908 0.252 -8.709 0.000 -2.684 -1.698
yearly_MONTHS_2019-05 -2.4297 0.252 -9.633 0.000 -2.924 -1.935
yearly_MONTHS_2019-06 -2.5605 0.253 -10.117 0.000 -3.057 -2.064
yearly_MONTHS_2019-07 -2.2439 0.252 -8.901 0.000 -2.738 -1.750
yearly_MONTHS_2019-08 -2.2494 0.255 -8.838 0.000 -2.748 -1.751
==================================================================================================
# initial list of columns
cols = X_train3.columns.tolist()

# setting an initial max p-value
max_p_value = 1

while len(cols) > 0:
    # defining the train set
    X_train_aux = X_train3[cols]

    # fitting the model
    model = sm.Logit(y_train1, X_train_aux).fit(disp=False)

    # getting the p-values and the maximum p-value
    p_values = model.pvalues
    max_p_value = max(p_values)

    # name of the variable with maximum p-value
    feature_with_p_max = p_values.idxmax()

    if max_p_value > 0.05:
        cols.remove(feature_with_p_max)
    else:
        break

selected_features = cols
print(selected_features)
['const', 'no_of_weekend_nights', 'no_of_week_nights', 'lead_time', 'avg_price_per_room', 'no_of_special_requests', 'type_of_meal_plan_Meal Plan 2', 'type_of_meal_plan_Not Selected', 'room_type_reserved_Room_Type 4', 'room_type_reserved_Room_Type 5', 'room_type_reserved_Room_Type 6', 'room_type_reserved_Room_Type 7', 'market_segment_type_Offline', 'market_segment_type_Online', 'yearly_MONTHS_2017-08', 'yearly_MONTHS_2017-09', 'yearly_MONTHS_2017-10', 'yearly_MONTHS_2017-11', 'yearly_MONTHS_2017-12', 'yearly_MONTHS_2018-01', 'yearly_MONTHS_2018-02', 'yearly_MONTHS_2018-03', 'yearly_MONTHS_2018-04', 'yearly_MONTHS_2018-05', 'yearly_MONTHS_2018-06', 'yearly_MONTHS_2018-07', 'yearly_MONTHS_2018-08', 'yearly_MONTHS_2018-09', 'yearly_MONTHS_2018-10', 'yearly_MONTHS_2018-11', 'yearly_MONTHS_2018-12', 'yearly_MONTHS_2019-01', 'yearly_MONTHS_2019-02', 'yearly_MONTHS_2019-03', 'yearly_MONTHS_2019-04', 'yearly_MONTHS_2019-05', 'yearly_MONTHS_2019-06', 'yearly_MONTHS_2019-07', 'yearly_MONTHS_2019-08']
After the stepwise loop, no selected feature has a p-value greater than 0.05, so we'll consider the features in X_train3 as the final ones and lg2 as the final model.
X_train3.columns
Index(['const', 'no_of_weekend_nights', 'no_of_week_nights', 'lead_time',
'avg_price_per_room', 'no_of_special_requests',
'type_of_meal_plan_Meal Plan 2', 'type_of_meal_plan_Not Selected',
'room_type_reserved_Room_Type 2', 'room_type_reserved_Room_Type 4',
'room_type_reserved_Room_Type 5', 'room_type_reserved_Room_Type 6',
'room_type_reserved_Room_Type 7', 'market_segment_type_Offline',
'market_segment_type_Online', 'yearly_MONTHS_2017-08',
'yearly_MONTHS_2017-09', 'yearly_MONTHS_2017-10',
'yearly_MONTHS_2017-11', 'yearly_MONTHS_2017-12',
'yearly_MONTHS_2018-01', 'yearly_MONTHS_2018-02',
'yearly_MONTHS_2018-03', 'yearly_MONTHS_2018-04',
'yearly_MONTHS_2018-05', 'yearly_MONTHS_2018-06',
'yearly_MONTHS_2018-07', 'yearly_MONTHS_2018-08',
'yearly_MONTHS_2018-09', 'yearly_MONTHS_2018-10',
'yearly_MONTHS_2018-11', 'yearly_MONTHS_2018-12',
'yearly_MONTHS_2019-01', 'yearly_MONTHS_2019-02',
'yearly_MONTHS_2019-03', 'yearly_MONTHS_2019-04',
'yearly_MONTHS_2019-05', 'yearly_MONTHS_2019-06',
'yearly_MONTHS_2019-07', 'yearly_MONTHS_2019-08'],
dtype='object')
The coefficients of the logistic regression model are in terms of log-odds. To find the odds, we take the exponential of the coefficients: odds = exp(b). The percentage change in odds is given by (exp(b) - 1) * 100.
# converting coefficients to odds
odds = np.exp(lg2.params)
# finding the percentage change
perc_change_odds = (np.exp(lg2.params) - 1) * 100
# removing limit from number of columns to display
pd.set_option("display.max_columns", None)
# adding the odds to a dataframe
pd.DataFrame({"Odds": odds, "Change_odd%": perc_change_odds}, index=X_train3.columns).T
| const | no_of_weekend_nights | no_of_week_nights | lead_time | avg_price_per_room | no_of_special_requests | type_of_meal_plan_Meal Plan 2 | type_of_meal_plan_Not Selected | room_type_reserved_Room_Type 2 | room_type_reserved_Room_Type 4 | room_type_reserved_Room_Type 5 | room_type_reserved_Room_Type 6 | room_type_reserved_Room_Type 7 | market_segment_type_Offline | market_segment_type_Online | yearly_MONTHS_2017-08 | yearly_MONTHS_2017-09 | yearly_MONTHS_2017-10 | yearly_MONTHS_2017-11 | yearly_MONTHS_2017-12 | yearly_MONTHS_2018-01 | yearly_MONTHS_2018-02 | yearly_MONTHS_2018-03 | yearly_MONTHS_2018-04 | yearly_MONTHS_2018-05 | yearly_MONTHS_2018-06 | yearly_MONTHS_2018-07 | yearly_MONTHS_2018-08 | yearly_MONTHS_2018-09 | yearly_MONTHS_2018-10 | yearly_MONTHS_2018-11 | yearly_MONTHS_2018-12 | yearly_MONTHS_2019-01 | yearly_MONTHS_2019-02 | yearly_MONTHS_2019-03 | yearly_MONTHS_2019-04 | yearly_MONTHS_2019-05 | yearly_MONTHS_2019-06 | yearly_MONTHS_2019-07 | yearly_MONTHS_2019-08 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Odds | 0.141906 | 1.064324 | 1.102437 | 1.017160 | 1.019047 | 0.256490 | 0.815276 | 1.409509 | 0.798860 | 0.834639 | 0.671390 | 0.602642 | 0.391153 | 0.257543 | 2.333789 | 0.159714 | 0.068826 | 0.043561 | 0.063656 | 0.024533 | 0.015429 | 0.211561 | 0.170692 | 0.119493 | 0.088229 | 0.105668 | 0.084871 | 0.096148 | 0.123110 | 0.127423 | 0.179602 | 0.044404 | 0.075828 | 0.213074 | 0.141190 | 0.111827 | 0.088060 | 0.077266 | 0.106045 | 0.105462 |
| Change_odd% | -85.809448 | 6.432424 | 10.243706 | 1.715978 | 1.904651 | -74.351016 | -18.472357 | 40.950858 | -20.114032 | -16.536099 | -32.861033 | -39.735800 | -60.884724 | -74.245686 | 133.378943 | -84.028582 | -93.117411 | -95.643949 | -93.634442 | -97.546718 | -98.457069 | -78.843871 | -82.930825 | -88.050733 | -91.177112 | -89.433203 | -91.512884 | -90.385241 | -87.689049 | -87.257694 | -82.039784 | -95.559569 | -92.417189 | -78.692558 | -85.880967 | -88.817290 | -91.193996 | -92.273369 | -89.395528 | -89.453846 |
no_of_weekend_nights: Holding all other features constant, a unit increase in no_of_weekend_nights increases the odds of a customer cancelling the booking by a factor of 1.06, i.e. a 6.4% increase in the odds.
no_of_week_nights: Holding all other features constant, a unit increase in no_of_week_nights increases the odds of cancelling by a factor of 1.10, i.e. a 10.2% increase in the odds.
lead_time: Holding all other features constant, each additional day of lead time increases the odds of cancelling by a factor of 1.02, i.e. a 1.72% increase in the odds.
avg_price_per_room: Holding all other features constant, a unit increase in avg_price_per_room increases the odds of cancelling by a factor of 1.02, i.e. a 1.91% increase in the odds.
no_of_special_requests: Holding all other features constant, each additional special request multiplies the odds of cancelling by 0.26, i.e. a 74.35% decrease in the odds.
market_segment_type_Online: Holding all other features constant, online bookings have 2.33 times the odds of being cancelled, i.e. a 133.38% increase in the odds.
market_segment_type_Offline: Holding all other features constant, offline bookings have 0.26 times the odds of being cancelled, i.e. a 74.25% decrease in the odds.
Interpretation for the other attributes can be done similarly.
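As a quick arithmetic check of the conversion (using the no_of_weekend_nights coefficient from the fitted summary above, ~0.0623):

```python
import numpy as np

coef = 0.0623  # no_of_weekend_nights coefficient from the model summary above
odds = np.exp(coef)                    # multiplicative change in odds per unit increase
pct_change = (np.exp(coef) - 1) * 100  # same change expressed as a percentage
# odds is about 1.064, pct_change about 6.4%
```

This matches the 1.06-times / 6.4% figure in the Odds table above.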
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_train3, y_train1)
True Positives (TP): bookings we correctly predicted as cancelled and that were actually cancelled: 6402, or 21.50%
True Negatives (TN): bookings we correctly predicted as not cancelled and that were not cancelled: 17262, or 57.97%
False Positives (FP, a "Type I error"): bookings we incorrectly predicted as cancelled but that were not cancelled: 2364, or 7.94%
False Negatives (FN, a "Type II error"): bookings we incorrectly predicted as not cancelled but that were actually cancelled: 3750, or 12.59%
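confusion_matrix_statsmodels is a helper defined earlier in the notebook. As a rough, numpy-only sketch of the counting it performs (the function name, signature, and default threshold here are assumptions, not the notebook's actual helper):

```python
import numpy as np

def confusion_counts(probs, target, threshold=0.5):
    """Count TN/FP/FN/TP from predicted probabilities at a given threshold."""
    pred = (np.asarray(probs) > threshold).astype(int)
    t = np.asarray(target)
    tp = int(np.sum((t == 1) & (pred == 1)))  # predicted cancel, did cancel
    tn = int(np.sum((t == 0) & (pred == 0)))  # predicted no cancel, did not cancel
    fp = int(np.sum((t == 0) & (pred == 1)))  # Type I error
    fn = int(np.sum((t == 1) & (pred == 0)))  # Type II error
    return tn, fp, fn, tp

# toy probabilities and labels (illustrative only)
print(confusion_counts([0.9, 0.2, 0.7, 0.4], [1, 0, 0, 1]))  # (1, 1, 1, 1)
```

Because a statsmodels Logit's predict() returns probabilities rather than class labels, some thresholding step like this is needed before counting.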
log_reg_model_train_perf = model_performance_classification_statsmodels(
lg2, X_train3, y_train1
)
print("Training performance:")
log_reg_model_train_perf
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.794681 | 0.630615 | 0.730322 | 0.676816 |
ROC-AUC on training set
logit_roc_auc_train = roc_auc_score(y_train1, lg2.predict(X_train3))
fpr, tpr, thresholds = roc_curve(y_train1, lg2.predict(X_train3))
plt.figure(figsize=(7, 5))
plt.plot(fpr, tpr, label="Logistic Regression (area = %0.2f)" % logit_roc_auc_train)
plt.plot([0, 1], [0, 1], "r--")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.title("Receiver operating characteristic")
plt.legend(loc="lower right")
plt.show()
The logistic regression model has moderate recall and a reasonable ROC-AUC score.
# Optimal threshold as per AUC-ROC curve
# The optimal cut off would be where tpr is high and fpr is low
fpr, tpr, thresholds = roc_curve(y_train1, lg2.predict(X_train3))
optimal_idx = np.argmax(tpr - fpr)
optimal_threshold_auc_roc = thresholds[optimal_idx]
print(optimal_threshold_auc_roc)
0.32290245919891175
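The cell above picks the threshold where tpr - fpr (Youden's J statistic) is largest. A toy illustration with made-up ROC points (the arrays below are illustrative, not from the fitted model):

```python
import numpy as np

# Made-up ROC points for illustration
fpr = np.array([0.0, 0.1, 0.3, 0.6, 1.0])
tpr = np.array([0.0, 0.5, 0.8, 0.9, 1.0])
thresholds = np.array([1.0, 0.7, 0.4, 0.2, 0.0])

j = tpr - fpr                    # Youden's J at each candidate threshold
best = thresholds[np.argmax(j)]  # threshold with the largest tpr - fpr gap
# j = [0.0, 0.4, 0.5, 0.3, 0.0], so best is 0.4
```

Maximizing tpr - fpr balances catching cancellations (high tpr) against falsely flagging good bookings (low fpr).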
# creating confusion matrix
confusion_matrix_statsmodels(
lg2, X_train3, y_train1, threshold=optimal_threshold_auc_roc
)
# checking model performance for this model
log_reg_model_train_perf_threshold_auc_roc = model_performance_classification_statsmodels(
lg2, X_train3, y_train1, threshold=optimal_threshold_auc_roc
)
print("Training performance:")
log_reg_model_train_perf_threshold_auc_roc
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.775337 | 0.803191 | 0.63475 | 0.709105 |
logit_roc_auc_train = roc_auc_score(y_train1, lg2.predict(X_train3))
fpr, tpr, thresholds = roc_curve(y_train1, lg2.predict(X_train3))
plt.figure(figsize=(7, 5))
plt.plot(fpr, tpr, label="Logistic Regression (area = %0.2f)" % logit_roc_auc_train)
plt.plot([0, 1], [0, 1], "r--")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.title("Receiver operating characteristic")
plt.legend(loc="lower right")
plt.show()
y_scores = lg2.predict(X_train3)
prec, rec, tre = precision_recall_curve(y_train1, y_scores,)
def plot_prec_recall_vs_tresh(precisions, recalls, thresholds):
    plt.plot(thresholds, precisions[:-1], "b--", label="precision")
    plt.plot(thresholds, recalls[:-1], "g--", label="recall")
    plt.xlabel("Threshold")
    plt.legend(loc="upper left")
    plt.ylim([0, 1])
plt.figure(figsize=(10, 7))
plot_prec_recall_vs_tresh(prec, rec, tre)
plt.show()
# setting the threshold
optimal_threshold_curve = 0.40
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_train3, y_train1, threshold=optimal_threshold_curve)
log_reg_model_train_perf_threshold_curve = model_performance_classification_statsmodels(
lg2, X_train3, y_train1, threshold=optimal_threshold_curve
)
print("Training performance:")
log_reg_model_train_perf_threshold_curve
Training performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.789677 | 0.735422 | 0.676084 | 0.704506 |
# training performance comparison
models_train_comp_df = pd.concat(
[
log_reg_model_train_perf.T,
log_reg_model_train_perf_threshold_auc_roc.T,
log_reg_model_train_perf_threshold_curve.T,
],
axis=1,
)
models_train_comp_df.columns = [
"Logistic Regression sklearn",
"Logistic Regression-0.323 Threshold",
"Logistic Regression-0.40 Threshold",
]
print("Training performance comparison:")
models_train_comp_df
Training performance comparison:
| Logistic Regression sklearn | Logistic Regression-0.323 Threshold | Logistic Regression-0.40 Threshold | |
|---|---|---|---|
| Accuracy | 0.794681 | 0.775337 | 0.789677 |
| Recall | 0.630615 | 0.803191 | 0.735422 |
| Precision | 0.730322 | 0.634750 | 0.676084 |
| F1 | 0.676816 | 0.709105 | 0.704506 |
X_test3 = X_test1[X_train3.columns].astype(float)
confusion_matrix_statsmodels(lg2, X_test3, y_test1)
log_reg_model_test_perf = model_performance_classification_statsmodels(
lg2, X_test3, y_test1
)
print("Test performance:")
log_reg_model_test_perf
Test performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.799655 | 0.635397 | 0.737463 | 0.682636 |
logit_roc_auc_train = roc_auc_score(y_test1, lg2.predict(X_test3))
fpr, tpr, thresholds = roc_curve(y_test1, lg2.predict(X_test3))
plt.figure(figsize=(7, 5))
plt.plot(fpr, tpr, label="Logistic Regression (area = %0.2f)" % logit_roc_auc_train)
plt.plot([0, 1], [0, 1], "r--")
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.title("Receiver operating characteristic")
plt.legend(loc="lower right")
plt.show()
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_test3, y_test1, threshold=optimal_threshold_auc_roc)
# checking model performance for this model
log_reg_model_test_perf_threshold_auc_roc = model_performance_classification_statsmodels(
lg2, X_test3, y_test1, threshold=optimal_threshold_auc_roc
)
print("Test performance:")
log_reg_model_test_perf_threshold_auc_roc
Test performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.776542 | 0.799908 | 0.635463 | 0.708265 |
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_test3, y_test1, threshold=optimal_threshold_curve)
The confusion matrix at the 0.40 threshold:
True Positives (TP): bookings we correctly predicted as cancelled and that were actually cancelled: 3170, or 24.84%
True Negatives (TN): bookings we correctly predicted as not cancelled and that were not cancelled: 6952, or 54.47%
False Positives (FP, a "Type I error"): bookings we incorrectly predicted as cancelled but that were not cancelled: 1483, or 11.62%
False Negatives (FN, a "Type II error"): bookings we incorrectly predicted as not cancelled but that were actually cancelled: 1258, or 9.07%
log_reg_model_test_perf_threshold_curve = model_performance_classification_statsmodels(
lg2, X_test3, y_test1, threshold=optimal_threshold_curve
)
print("Test performance:")
log_reg_model_test_perf_threshold_curve
Test performance:
| Accuracy | Recall | Precision | F1 | |
|---|---|---|---|---|
| 0 | 0.793074 | 0.73244 | 0.681281 | 0.705935 |
# training performance comparison
models_train_comp_df = pd.concat(
[
log_reg_model_train_perf.T,
log_reg_model_train_perf_threshold_auc_roc.T,
log_reg_model_train_perf_threshold_curve.T,
],
axis=1,
)
models_train_comp_df.columns = [
"Logistic Regression sklearn",
"Logistic Regression-0.323 Threshold",
"Logistic Regression-0.42 Threshold",
]
print("Training performance comparison:")
models_train_comp_df
Training performance comparison:
| Logistic Regression sklearn | Logistic Regression-0.323 Threshold | Logistic Regression-0.40 Threshold | |
|---|---|---|---|
| Accuracy | 0.794681 | 0.775337 | 0.789677 |
| Recall | 0.630615 | 0.803191 | 0.735422 |
| Precision | 0.730322 | 0.634750 | 0.676084 |
| F1 | 0.676816 | 0.709105 | 0.704506 |
# testing performance comparison
models_test_comp_df = pd.concat(
[
log_reg_model_test_perf.T,
log_reg_model_test_perf_threshold_auc_roc.T,
log_reg_model_test_perf_threshold_curve.T,
],
axis=1,
)
models_test_comp_df.columns = [
"Logistic Regression sklearn",
"Logistic Regression-0.323 Threshold",
"Logistic Regression-0.40 Threshold",
]
print("Test set performance comparison:")
models_test_comp_df
Test set performance comparison:
| Logistic Regression sklearn | Logistic Regression-0.323 Threshold | Logistic Regression-0.40 Threshold | |
|---|---|---|---|
| Accuracy | 0.799655 | 0.776542 | 0.793074 |
| Recall | 0.635397 | 0.799908 | 0.732440 |
| Precision | 0.737463 | 0.635463 | 0.681281 |
| F1 | 0.682636 | 0.708265 | 0.705935 |
We'll take the features in X_train3 as the final feature set, lg2 as the final model, and 0.323 as the final threshold.
# creating confusion matrix
confusion_matrix_statsmodels(lg2, X_test3, y_test1, threshold=optimal_threshold_auc_roc)
True Positives (TP): we correctly predicted that a guest would cancel the booking and they did: 3462, or 27.13%.
True Negatives (TN): we correctly predicted that a guest would not cancel the booking and they did not: 6249, or 50.53%.
False Positives (FP): we incorrectly predicted that a guest would cancel the booking when they actually did not (a "Type I error"): 1986, or 15.56%.
False Negatives (FN): we incorrectly predicted that a guest would not cancel the booking when they actually did (a "Type II error"): 866, or 6.79%.
# checking model performance for this model
log_reg_model_test_perf_threshold_auc_roc = model_performance_classification_statsmodels(
lg2, X_test3, y_test1, threshold=optimal_threshold_auc_roc
)
print("Test performance:")
log_reg_model_test_perf_threshold_auc_roc
Test performance:
| | Accuracy | Recall | Precision | F1 |
|---|---|---|---|---|
| 0 | 0.776542 | 0.799908 | 0.635463 | 0.708265 |
We will build our model using the DecisionTreeClassifier class, with the default 'gini' criterion for splitting.
# Libraries to build decision tree classifier
from sklearn.tree import DecisionTreeClassifier
from sklearn import tree
model = DecisionTreeClassifier(
criterion="gini", random_state=1 #, class_weight={0: 0.34, 1: 0.66}
)
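The commented-out class_weight argument is one way to counter the class imbalance between cancelled and honored bookings. Instead of hand-picked weights, sklearn also accepts class_weight="balanced", which reweights classes inversely to their frequencies. A small sketch (not fitted to the booking data here):

```python
from sklearn.tree import DecisionTreeClassifier

# "balanced" derives weights as n_samples / (n_classes * class_counts),
# an alternative to hand-picked weights like {0: 0.34, 1: 0.66}
weighted_model = DecisionTreeClassifier(
    criterion="gini", random_state=1, class_weight="balanced"
)
print(weighted_model.class_weight)
```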
# Splitting the target from predictors
X = data.drop(["booking_status"], axis=1)
y = data["booking_status"]
X = X.drop(
    [
        "no_of_children",
        "repeated_guest",
        "required_car_parking_space",
        "no_of_previous_cancellations",
        "no_of_previous_bookings_not_canceled",
    ],
    axis=1,
)
X = pd.get_dummies(X, drop_first = True)
X.head()
| no_of_adults | no_of_weekend_nights | no_of_week_nights | lead_time | avg_price_per_room | no_of_special_requests | Day of the Week | type_of_meal_plan_Meal Plan 2 | type_of_meal_plan_Meal Plan 3 | type_of_meal_plan_Not Selected | room_type_reserved_Room_Type 2 | room_type_reserved_Room_Type 3 | room_type_reserved_Room_Type 4 | room_type_reserved_Room_Type 5 | room_type_reserved_Room_Type 6 | room_type_reserved_Room_Type 7 | market_segment_type_Complementary | market_segment_type_Corporate | market_segment_type_Offline | market_segment_type_Online | yearly_MONTHS_2017-08 | yearly_MONTHS_2017-09 | yearly_MONTHS_2017-10 | yearly_MONTHS_2017-11 | yearly_MONTHS_2017-12 | yearly_MONTHS_2018-01 | yearly_MONTHS_2018-02 | yearly_MONTHS_2018-03 | yearly_MONTHS_2018-04 | yearly_MONTHS_2018-05 | yearly_MONTHS_2018-06 | yearly_MONTHS_2018-07 | yearly_MONTHS_2018-08 | yearly_MONTHS_2018-09 | yearly_MONTHS_2018-10 | yearly_MONTHS_2018-11 | yearly_MONTHS_2018-12 | yearly_MONTHS_2019-01 | yearly_MONTHS_2019-02 | yearly_MONTHS_2019-03 | yearly_MONTHS_2019-04 | yearly_MONTHS_2019-05 | yearly_MONTHS_2019-06 | yearly_MONTHS_2019-07 | yearly_MONTHS_2019-08 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | 1 | 2 | 224 | 65.00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 2 | 2 | 3 | 5 | 106.68 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 2 | 2 | 1 | 1 | 60.00 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 2 | 0 | 2 | 211 | 100.00 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 2 | 0 | 3 | 277 | 89.10 | 2 | 5 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 |
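pd.get_dummies with drop_first=True one-hot encodes each categorical column and drops one level as the baseline, which is why, for example, Meal Plan 1 and Room_Type 1 have no column of their own above. A toy illustration (made-up frame, not the booking data):

```python
import pandas as pd

# Hypothetical categorical column; the first level alphabetically
# ("Not Selected") becomes the implicit baseline and gets no dummy
toy = pd.DataFrame({"meal_plan": ["Plan 1", "Plan 2", "Plan 1", "Not Selected"]})
encoded = pd.get_dummies(toy, drop_first=True)
print(list(encoded.columns))  # ['meal_plan_Plan 1', 'meal_plan_Plan 2']
```

Dropping one level per category avoids perfectly collinear dummy columns, which matters for models like logistic regression.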
# Importing the metrics used by the helper functions below
from sklearn.metrics import recall_score, confusion_matrix

## Function to calculate recall score
def get_recall_score(model, predictors, target):
"""
model: classifier
predictors: independent variables
target: dependent variable
"""
prediction = model.predict(predictors)
return recall_score(target, prediction)
def confusion_matrix_sklearn(model, predictors, target):
"""
To plot the confusion_matrix with percentages
model: classifier
predictors: independent variables
target: dependent variable
"""
y_pred = model.predict(predictors)
cm = confusion_matrix(target, y_pred)
labels = np.asarray(
[
["{0:0.0f}".format(item) + "\n{0:.2%}".format(item / cm.flatten().sum())]
for item in cm.flatten()
]
).reshape(2, 2)
plt.figure(figsize=(6, 4))
sns.heatmap(cm, annot=labels, fmt="")
plt.ylabel("True label")
plt.xlabel("Predicted label")
model.fit(X_train, y_train)
DecisionTreeClassifier(random_state=1)
confusion_matrix_sklearn(model, X_train, y_train)
decision_tree_perf_train = get_recall_score(model, X_train, y_train)
print("Recall Score:", decision_tree_perf_train)
Recall Score: 0.9880225728434873
confusion_matrix_sklearn(model, X_test, y_test)
decision_tree_perf_test = get_recall_score(model, X_test, y_test)
print("Recall Score:", decision_tree_perf_test)
Recall Score: 0.7008797653958945
There is a large disparity between the model's performance on the training set and the test set, which suggests that the model is overfitting.
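One common remedy (not applied in the cell above) is to constrain the tree, e.g. via max_depth, min_samples_leaf, or cost-complexity pruning, trading some training recall for better generalization. A minimal sketch on synthetic data:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic data: one informative feature plus noise
rng = np.random.default_rng(1)
X_toy = rng.normal(size=(200, 4))
y_toy = (X_toy[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# The unconstrained tree memorizes the noise; a shallow tree cannot
full = DecisionTreeClassifier(random_state=1).fit(X_toy, y_toy)
shallow = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X_toy, y_toy)

print(full.get_depth(), full.score(X_toy, y_toy))        # deep, train accuracy 1.0
print(shallow.get_depth(), shallow.score(X_toy, y_toy))  # depth <= 3, lower train accuracy
```

Perfect training accuracy with much weaker test accuracy, as seen in the recall scores above, is the classic signature of an overgrown tree.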
feature_names = X_train.columns.to_list()
# Text report showing the rules of a decision tree -
print(tree.export_text(model, feature_names=feature_names, show_weights=True))
|--- lead_time <= 150.50
|   |--- no_of_special_requests <= 0.50
|   |   |--- market_segment_type_Online <= 0.50
|   |   |   |--- lead_time <= 88.50
|   |   |   |   |--- avg_price_per_room <= 202.00
|   |   |   |   |   |--- no_of_weekend_nights <= 4.50
|   |   |   |   |   |   |--- avg_price_per_room <= 89.05
|   |   |   |   |   |   |   |--- market_segment_type_Corporate <= 0.50
|   |   |   |   |   |   |   |   |--- no_of_weekend_nights <= 0.50
|   |   |   |   |   |   |   |   |   |--- yearly_MONTHS_2019-08 <= 0.50
|   |   |   |   |   |   |   |   |   |   |--- weights: [473.00, 0.00] class: 0
|   |   |   |   |   |   |   |   |   |--- yearly_MONTHS_2019-08 > 0.50
|   |   |   |   |   |   |   |   |   |   |--- weights: [0.00, 1.00] class: 1
...
(remaining rules omitted: the unpruned tree is extremely deep, with truncated branches of depth 30+ that split repeatedly on lead_time, avg_price_per_room, no_of_week_nights, no_of_weekend_nights, and the yearly_MONTHS dummies)
| | | | | | |--- yearly_MONTHS_2017-08 <= 0.50 | | | | | | | | |--- avg_price_per_room <= 124.33 | | | | | | | | | |--- yearly_MONTHS_2017-09 <= 0.50 | | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | | | |--- yearly_MONTHS_2017-09 > 0.50 | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | | |--- avg_price_per_room > 124.33 | | | | | | | | | |--- yearly_MONTHS_2019-04 <= 0.50 | | | | | | | | | | |--- yearly_MONTHS_2018-09 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 10 | | | | | | | | | | |--- yearly_MONTHS_2018-09 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | |--- yearly_MONTHS_2019-04 > 0.50 | | | | | | | | | | |--- no_of_weekend_nights <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 5 | | | | | | | | | | |--- no_of_weekend_nights > 0.50 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | |--- yearly_MONTHS_2017-08 > 0.50 | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | |--- no_of_week_nights > 8.00 | | | | | |--- room_type_reserved_Room_Type 4 <= 0.50 | | | | | | |--- yearly_MONTHS_2018-01 <= 0.50 | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | |--- yearly_MONTHS_2018-01 > 0.50 | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | |--- room_type_reserved_Room_Type 4 > 0.50 | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | |--- lead_time > 7.50 | | | | |--- avg_price_per_room <= 107.91 | | | | | |--- market_segment_type_Online <= 0.50 | | | | | | |--- lead_time <= 94.50 | | | | | | | |--- type_of_meal_plan_Not Selected <= 0.50 | | | | | | | | |--- weights: [376.00, 0.00] class: 0 | | | | | | | |--- type_of_meal_plan_Not Selected > 0.50 | | | | | | | | |--- avg_price_per_room <= 99.00 | | | | | | | | | |--- weights: [9.00, 0.00] class: 0 | | | | | | | | |--- avg_price_per_room > 99.00 | | | | | | | | | |--- lead_time <= 42.50 | | | | | | | | | | |--- weights: [1.00, 1.00] class: 0 | | | | | | | | | |--- 
lead_time > 42.50 | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | |--- lead_time > 94.50 | | | | | | | |--- no_of_weekend_nights <= 3.00 | | | | | | | | |--- no_of_week_nights <= 2.50 | | | | | | | | | |--- lead_time <= 108.00 | | | | | | | | | | |--- lead_time <= 106.50 | | | | | | | | | | | |--- truncated branch of depth 6 | | | | | | | | | | |--- lead_time > 106.50 | | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | | | |--- lead_time > 108.00 | | | | | | | | | | |--- yearly_MONTHS_2018-09 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 5 | | | | | | | | | | |--- yearly_MONTHS_2018-09 > 0.50 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | |--- no_of_week_nights > 2.50 | | | | | | | | | |--- yearly_MONTHS_2017-08 <= 0.50 | | | | | | | | | | |--- type_of_meal_plan_Meal Plan 2 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 5 | | | | | | | | | | |--- type_of_meal_plan_Meal Plan 2 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | |--- yearly_MONTHS_2017-08 > 0.50 | | | | | | | | | | |--- lead_time <= 115.00 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | | |--- lead_time > 115.00 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | |--- no_of_weekend_nights > 3.00 | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | |--- market_segment_type_Online > 0.50 | | | | | | |--- no_of_weekend_nights <= 2.50 | | | | | | | |--- lead_time <= 69.50 | | | | | | | | |--- yearly_MONTHS_2019-02 <= 0.50 | | | | | | | | | |--- yearly_MONTHS_2018-11 <= 0.50 | | | | | | | | | | |--- yearly_MONTHS_2017-08 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 20 | | | | | | | | | | |--- yearly_MONTHS_2017-08 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 6 | | | | | | | | | |--- yearly_MONTHS_2018-11 > 0.50 | | | | | | | | | | |--- avg_price_per_room <= 74.71 | | | | | | | | | | | |--- 
weights: [31.00, 0.00] class: 0 | | | | | | | | | | |--- avg_price_per_room > 74.71 | | | | | | | | | | | |--- truncated branch of depth 15 | | | | | | | | |--- yearly_MONTHS_2019-02 > 0.50 | | | | | | | | | |--- avg_price_per_room <= 69.60 | | | | | | | | | | |--- weights: [22.00, 0.00] class: 0 | | | | | | | | | |--- avg_price_per_room > 69.60 | | | | | | | | | | |--- lead_time <= 28.50 | | | | | | | | | | | |--- truncated branch of depth 9 | | | | | | | | | | |--- lead_time > 28.50 | | | | | | | | | | | |--- truncated branch of depth 9 | | | | | | | |--- lead_time > 69.50 | | | | | | | | |--- yearly_MONTHS_2019-02 <= 0.50 | | | | | | | | | |--- yearly_MONTHS_2018-07 <= 0.50 | | | | | | | | | | |--- yearly_MONTHS_2018-06 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 22 | | | | | | | | | | |--- yearly_MONTHS_2018-06 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 7 | | | | | | | | | |--- yearly_MONTHS_2018-07 > 0.50 | | | | | | | | | | |--- Day of the Week <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | | |--- Day of the Week > 0.50 | | | | | | | | | | | |--- truncated branch of depth 5 | | | | | | | | |--- yearly_MONTHS_2019-02 > 0.50 | | | | | | | | | |--- Day of the Week <= 1.50 | | | | | | | | | | |--- lead_time <= 71.50 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | | |--- lead_time > 71.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | |--- Day of the Week > 1.50 | | | | | | | | | | |--- no_of_weekend_nights <= 1.50 | | | | | | | | | | | |--- truncated branch of depth 9 | | | | | | | | | | |--- no_of_weekend_nights > 1.50 | | | | | | | | | | | |--- weights: [0.00, 6.00] class: 1 | | | | | | |--- no_of_weekend_nights > 2.50 | | | | | | | |--- yearly_MONTHS_2019-01 <= 0.50 | | | | | | | | |--- lead_time <= 108.00 | | | | | | | | | |--- yearly_MONTHS_2018-01 <= 0.50 | | | | | | | | | | |--- yearly_MONTHS_2019-03 <= 0.50 | | | | | | | | | | | |--- 
truncated branch of depth 3 | | | | | | | | | | |--- yearly_MONTHS_2019-03 > 0.50 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | |--- yearly_MONTHS_2018-01 > 0.50 | | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | | |--- lead_time > 108.00 | | | | | | | | | |--- avg_price_per_room <= 77.79 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- avg_price_per_room > 77.79 | | | | | | | | | | |--- weights: [4.00, 0.00] class: 0 | | | | | | | |--- yearly_MONTHS_2019-01 > 0.50 | | | | | | | | |--- lead_time <= 46.00 | | | | | | | | | |--- weights: [5.00, 0.00] class: 0 | | | | | | | | |--- lead_time > 46.00 | | | | | | | | | |--- avg_price_per_room <= 90.10 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- avg_price_per_room > 90.10 | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | |--- avg_price_per_room > 107.91 | | | | | |--- yearly_MONTHS_2019-08 <= 0.50 | | | | | | |--- type_of_meal_plan_Not Selected <= 0.50 | | | | | | | |--- avg_price_per_room <= 147.95 | | | | | | | | |--- no_of_weekend_nights <= 3.50 | | | | | | | | | |--- lead_time <= 30.50 | | | | | | | | | | |--- avg_price_per_room <= 135.55 | | | | | | | | | | | |--- truncated branch of depth 15 | | | | | | | | | | |--- avg_price_per_room > 135.55 | | | | | | | | | | | |--- truncated branch of depth 11 | | | | | | | | | |--- lead_time > 30.50 | | | | | | | | | | |--- yearly_MONTHS_2018-10 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 24 | | | | | | | | | | |--- yearly_MONTHS_2018-10 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 14 | | | | | | | | |--- no_of_weekend_nights > 3.50 | | | | | | | | | |--- avg_price_per_room <= 145.78 | | | | | | | | | | |--- avg_price_per_room <= 127.71 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | | |--- avg_price_per_room > 127.71 | | | | | | | | | | | |--- weights: [0.00, 6.00] class: 1 | | | | | | | | | 
|--- avg_price_per_room > 145.78 | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | |--- avg_price_per_room > 147.95 | | | | | | | | |--- lead_time <= 87.50 | | | | | | | | | |--- no_of_week_nights <= 2.50 | | | | | | | | | | |--- yearly_MONTHS_2017-08 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 22 | | | | | | | | | | |--- yearly_MONTHS_2017-08 > 0.50 | | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | | | |--- no_of_week_nights > 2.50 | | | | | | | | | | |--- avg_price_per_room <= 199.40 | | | | | | | | | | | |--- truncated branch of depth 23 | | | | | | | | | | |--- avg_price_per_room > 199.40 | | | | | | | | | | | |--- truncated branch of depth 12 | | | | | | | | |--- lead_time > 87.50 | | | | | | | | | |--- yearly_MONTHS_2018-08 <= 0.50 | | | | | | | | | | |--- avg_price_per_room <= 201.15 | | | | | | | | | | | |--- truncated branch of depth 13 | | | | | | | | | | |--- avg_price_per_room > 201.15 | | | | | | | | | | | |--- weights: [0.00, 5.00] class: 1 | | | | | | | | | |--- yearly_MONTHS_2018-08 > 0.50 | | | | | | | | | | |--- avg_price_per_room <= 234.14 | | | | | | | | | | | |--- weights: [19.00, 0.00] class: 0 | | | | | | | | | | |--- avg_price_per_room > 234.14 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | |--- type_of_meal_plan_Not Selected > 0.50 | | | | | | | |--- no_of_weekend_nights <= 1.50 | | | | | | | | |--- lead_time <= 122.00 | | | | | | | | | |--- avg_price_per_room <= 171.00 | | | | | | | | | | |--- lead_time <= 10.50 | | | | | | | | | | | |--- truncated branch of depth 7 | | | | | | | | | | |--- lead_time > 10.50 | | | | | | | | | | | |--- truncated branch of depth 20 | | | | | | | | | |--- avg_price_per_room > 171.00 | | | | | | | | | | |--- lead_time <= 11.00 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | | |--- lead_time > 11.00 | | | | | | | | | | | |--- weights: [0.00, 4.00] class: 1 | | | | | | | | |--- lead_time > 122.00 | | | | 
| | | | | |--- avg_price_per_room <= 114.05 | | | | | | | | | | |--- lead_time <= 149.00 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | | |--- lead_time > 149.00 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | |--- avg_price_per_room > 114.05 | | | | | | | | | | |--- yearly_MONTHS_2018-09 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | | |--- yearly_MONTHS_2018-09 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | |--- no_of_weekend_nights > 1.50 | | | | | | | | |--- avg_price_per_room <= 125.55 | | | | | | | | | |--- yearly_MONTHS_2018-10 <= 0.50 | | | | | | | | | | |--- no_of_week_nights <= 6.50 | | | | | | | | | | | |--- truncated branch of depth 14 | | | | | | | | | | |--- no_of_week_nights > 6.50 | | | | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | | | | |--- yearly_MONTHS_2018-10 > 0.50 | | | | | | | | | | |--- lead_time <= 15.00 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | | |--- lead_time > 15.00 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | |--- avg_price_per_room > 125.55 | | | | | | | | | |--- yearly_MONTHS_2018-09 <= 0.50 | | | | | | | | | | |--- no_of_week_nights <= 3.50 | | | | | | | | | | | |--- truncated branch of depth 12 | | | | | | | | | | |--- no_of_week_nights > 3.50 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | |--- yearly_MONTHS_2018-09 > 0.50 | | | | | | | | | | |--- weights: [3.00, 0.00] class: 0 | | | | | |--- yearly_MONTHS_2019-08 > 0.50 | | | | | | |--- avg_price_per_room <= 124.50 | | | | | | | |--- no_of_week_nights <= 0.50 | | | | | | | | |--- lead_time <= 23.00 | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | |--- lead_time > 23.00 | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | |--- no_of_week_nights > 0.50 | | | | | | | | |--- no_of_week_nights <= 5.50 | | | | | | | | | |--- 
avg_price_per_room <= 109.55 | | | | | | | | | | |--- avg_price_per_room <= 109.42 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | | |--- avg_price_per_room > 109.42 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- avg_price_per_room > 109.55 | | | | | | | | | | |--- lead_time <= 102.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | | |--- lead_time > 102.50 | | | | | | | | | | | |--- weights: [12.00, 0.00] class: 0 | | | | | | | | |--- no_of_week_nights > 5.50 | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | |--- avg_price_per_room > 124.50 | | | | | | | |--- avg_price_per_room <= 195.17 | | | | | | | | |--- lead_time <= 140.50 | | | | | | | | | |--- lead_time <= 11.50 | | | | | | | | | | |--- avg_price_per_room <= 131.17 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | | |--- avg_price_per_room > 131.17 | | | | | | | | | | | |--- weights: [7.00, 0.00] class: 0 | | | | | | | | | |--- lead_time > 11.50 | | | | | | | | | | |--- no_of_week_nights <= 2.50 | | | | | | | | | | | |--- truncated branch of depth 13 | | | | | | | | | | |--- no_of_week_nights > 2.50 | | | | | | | | | | | |--- truncated branch of depth 15 | | | | | | | | |--- lead_time > 140.50 | | | | | | | | | |--- type_of_meal_plan_Not Selected <= 0.50 | | | | | | | | | | |--- no_of_weekend_nights <= 2.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | | |--- no_of_weekend_nights > 2.50 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- type_of_meal_plan_Not Selected > 0.50 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | |--- avg_price_per_room > 195.17 | | | | | | | | |--- lead_time <= 96.00 | | | | | | | | | |--- Day of the Week <= 1.50 | | | | | | | | | | |--- avg_price_per_room <= 236.29 | | | | | | | | | | | |--- weights: [0.00, 8.00] class: 1 | | | | | | | | | | |--- avg_price_per_room > 236.29 | | | 
| | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | |--- Day of the Week > 1.50 | | | | | | | | | | |--- lead_time <= 22.00 | | | | | | | | | | | |--- weights: [4.00, 0.00] class: 0 | | | | | | | | | | |--- lead_time > 22.00 | | | | | | | | | | | |--- truncated branch of depth 6 | | | | | | | | |--- lead_time > 96.00 | | | | | | | | | |--- weights: [0.00, 9.00] class: 1 | | |--- no_of_special_requests > 1.50 | | | |--- lead_time <= 91.50 | | | | |--- no_of_week_nights <= 3.50 | | | | | |--- lead_time <= 90.50 | | | | | | |--- weights: [2505.00, 0.00] class: 0 | | | | | |--- lead_time > 90.50 | | | | | | |--- yearly_MONTHS_2019-08 <= 0.50 | | | | | | | |--- weights: [16.00, 0.00] class: 0 | | | | | | |--- yearly_MONTHS_2019-08 > 0.50 | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | |--- no_of_week_nights > 3.50 | | | | | |--- no_of_weekend_nights <= 4.50 | | | | | | |--- no_of_special_requests <= 2.50 | | | | | | | |--- lead_time <= 25.50 | | | | | | | | |--- no_of_week_nights <= 8.50 | | | | | | | | | |--- avg_price_per_room <= 130.11 | | | | | | | | | | |--- Day of the Week <= 1.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | | |--- Day of the Week > 1.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | |--- avg_price_per_room > 130.11 | | | | | | | | | | |--- avg_price_per_room <= 140.03 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | | |--- avg_price_per_room > 140.03 | | | | | | | | | | | |--- truncated branch of depth 6 | | | | | | | | |--- no_of_week_nights > 8.50 | | | | | | | | | |--- yearly_MONTHS_2017-08 <= 0.50 | | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | | | |--- yearly_MONTHS_2017-08 > 0.50 | | | | | | | | | | |--- Day of the Week <= 4.50 | | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | | | | |--- Day of the Week > 4.50 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | |--- lead_time 
> 25.50 | | | | | | | | |--- avg_price_per_room <= 81.94 | | | | | | | | | |--- avg_price_per_room <= 78.53 | | | | | | | | | | |--- avg_price_per_room <= 70.13 | | | | | | | | | | | |--- weights: [5.00, 0.00] class: 0 | | | | | | | | | | |--- avg_price_per_room > 70.13 | | | | | | | | | | | |--- truncated branch of depth 9 | | | | | | | | | |--- avg_price_per_room > 78.53 | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | | |--- avg_price_per_room > 81.94 | | | | | | | | | |--- room_type_reserved_Room_Type 7 <= 0.50 | | | | | | | | | | |--- yearly_MONTHS_2018-08 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 10 | | | | | | | | | | |--- yearly_MONTHS_2018-08 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | |--- room_type_reserved_Room_Type 7 > 0.50 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | |--- no_of_special_requests > 2.50 | | | | | | | |--- weights: [88.00, 0.00] class: 0 | | | | | |--- no_of_weekend_nights > 4.50 | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | |--- lead_time > 91.50 | | | | |--- avg_price_per_room <= 200.70 | | | | | |--- no_of_special_requests <= 2.50 | | | | | | |--- yearly_MONTHS_2018-07 <= 0.50 | | | | | | | |--- yearly_MONTHS_2018-06 <= 0.50 | | | | | | | | |--- yearly_MONTHS_2018-05 <= 0.50 | | | | | | | | | |--- avg_price_per_room <= 125.05 | | | | | | | | | | |--- avg_price_per_room <= 74.82 | | | | | | | | | | | |--- truncated branch of depth 9 | | | | | | | | | | |--- avg_price_per_room > 74.82 | | | | | | | | | | | |--- truncated branch of depth 21 | | | | | | | | | |--- avg_price_per_room > 125.05 | | | | | | | | | | |--- room_type_reserved_Room_Type 5 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 16 | | | | | | | | | | |--- room_type_reserved_Room_Type 5 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | |--- yearly_MONTHS_2018-05 > 0.50 | | | | | | | | | |--- lead_time <= 96.50 | | | | | | | | | 
| |--- lead_time <= 93.50 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | | |--- lead_time > 93.50 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- lead_time > 96.50 | | | | | | | | | | |--- weights: [24.00, 0.00] class: 0 | | | | | | | |--- yearly_MONTHS_2018-06 > 0.50 | | | | | | | | |--- avg_price_per_room <= 96.24 | | | | | | | | | |--- avg_price_per_room <= 89.31 | | | | | | | | | | |--- weights: [3.00, 0.00] class: 0 | | | | | | | | | |--- avg_price_per_room > 89.31 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | |--- avg_price_per_room > 96.24 | | | | | | | | | |--- weights: [29.00, 0.00] class: 0 | | | | | | |--- yearly_MONTHS_2018-07 > 0.50 | | | | | | | |--- no_of_week_nights <= 6.50 | | | | | | | | |--- no_of_week_nights <= 4.50 | | | | | | | | | |--- lead_time <= 143.50 | | | | | | | | | | |--- weights: [49.00, 0.00] class: 0 | | | | | | | | | |--- lead_time > 143.50 | | | | | | | | | | |--- no_of_week_nights <= 3.50 | | | | | | | | | | | |--- weights: [6.00, 0.00] class: 0 | | | | | | | | | | |--- no_of_week_nights > 3.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | |--- no_of_week_nights > 4.50 | | | | | | | | | |--- room_type_reserved_Room_Type 4 <= 0.50 | | | | | | | | | | |--- avg_price_per_room <= 105.83 | | | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | | | | |--- avg_price_per_room > 105.83 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | |--- room_type_reserved_Room_Type 4 > 0.50 | | | | | | | | | | |--- weights: [4.00, 0.00] class: 0 | | | | | | | |--- no_of_week_nights > 6.50 | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | |--- no_of_special_requests > 2.50 | | | | | | |--- weights: [117.00, 0.00] class: 0 | | | | |--- avg_price_per_room > 200.70 | | | | | |--- no_of_special_requests <= 2.50 | | | | | | |--- weights: [0.00, 19.00] class: 1 | | | | | |--- no_of_special_requests 
> 2.50 | | | | | | |--- weights: [3.00, 0.00] class: 0 |--- lead_time > 150.50 | |--- avg_price_per_room <= 100.06 | | |--- no_of_special_requests <= 0.50 | | | |--- market_segment_type_Online <= 0.50 | | | | |--- lead_time <= 348.50 | | | | | |--- lead_time <= 243.50 | | | | | | |--- avg_price_per_room <= 96.91 | | | | | | | |--- Day of the Week <= 5.50 | | | | | | | | |--- yearly_MONTHS_2018-05 <= 0.50 | | | | | | | | | |--- no_of_week_nights <= 7.50 | | | | | | | | | | |--- lead_time <= 213.00 | | | | | | | | | | | |--- truncated branch of depth 17 | | | | | | | | | | |--- lead_time > 213.00 | | | | | | | | | | | |--- weights: [32.00, 0.00] class: 0 | | | | | | | | | |--- no_of_week_nights > 7.50 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | |--- yearly_MONTHS_2018-05 > 0.50 | | | | | | | | | |--- lead_time <= 179.50 | | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | | | |--- lead_time > 179.50 | | | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | | |--- Day of the Week > 5.50 | | | | | | | | |--- avg_price_per_room <= 82.88 | | | | | | | | | |--- no_of_week_nights <= 2.50 | | | | | | | | | | |--- yearly_MONTHS_2019-07 <= 0.50 | | | | | | | | | | | |--- weights: [14.00, 0.00] class: 0 | | | | | | | | | | |--- yearly_MONTHS_2019-07 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | |--- no_of_week_nights > 2.50 | | | | | | | | | | |--- no_of_weekend_nights <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | | |--- no_of_weekend_nights > 0.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | |--- avg_price_per_room > 82.88 | | | | | | | | | |--- avg_price_per_room <= 89.05 | | | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | | | | |--- avg_price_per_room > 89.05 | | | | | | | | | | |--- avg_price_per_room <= 89.55 | | | | | | | | | | | |--- weights: [3.00, 0.00] class: 0 | | | | | | | | | | |--- avg_price_per_room 
> 89.55 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | |--- avg_price_per_room > 96.91 | | | | | | | |--- avg_price_per_room <= 97.88 | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | | |--- avg_price_per_room > 97.88 | | | | | | | | |--- yearly_MONTHS_2019-06 <= 0.50 | | | | | | | | | |--- yearly_MONTHS_2018-05 <= 0.50 | | | | | | | | | | |--- avg_price_per_room <= 99.50 | | | | | | | | | | | |--- weights: [3.00, 0.00] class: 0 | | | | | | | | | | |--- avg_price_per_room > 99.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | | |--- yearly_MONTHS_2018-05 > 0.50 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | |--- yearly_MONTHS_2019-06 > 0.50 | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | |--- lead_time > 243.50 | | | | | | |--- avg_price_per_room <= 88.50 | | | | | | | |--- yearly_MONTHS_2019-05 <= 0.50 | | | | | | | | |--- avg_price_per_room <= 80.42 | | | | | | | | | |--- Day of the Week <= 5.50 | | | | | | | | | | |--- yearly_MONTHS_2019-07 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 7 | | | | | | | | | | |--- yearly_MONTHS_2019-07 > 0.50 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- Day of the Week > 5.50 | | | | | | | | | | |--- yearly_MONTHS_2018-12 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 4 | | | | | | | | | | |--- yearly_MONTHS_2018-12 > 0.50 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | |--- avg_price_per_room > 80.42 | | | | | | | | | |--- weights: [14.00, 0.00] class: 0 | | | | | | | |--- yearly_MONTHS_2019-05 > 0.50 | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | |--- avg_price_per_room > 88.50 | | | | | | | |--- type_of_meal_plan_Meal Plan 2 <= 0.50 | | | | | | | | |--- lead_time <= 327.00 | | | | | | | | | |--- no_of_week_nights <= 5.00 | | | | | | | | | | |--- lead_time <= 292.50 | | | | | | | | | | | |--- truncated branch of depth 3 | | | 
| | | | | | | |--- lead_time > 292.50 | | | | | | | | | | | |--- truncated branch of depth 5 | | | | | | | | | |--- no_of_week_nights > 5.00 | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | |--- lead_time > 327.00 | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | |--- type_of_meal_plan_Meal Plan 2 > 0.50 | | | | | | | | |--- lead_time <= 310.50 | | | | | | | | | |--- weights: [5.00, 0.00] class: 0 | | | | | | | | |--- lead_time > 310.50 | | | | | | | | | |--- yearly_MONTHS_2018-09 <= 0.50 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | | | |--- yearly_MONTHS_2018-09 > 0.50 | | | | | | | | | | |--- weights: [1.00, 1.00] class: 0 | | | | |--- lead_time > 348.50 | | | | | |--- type_of_meal_plan_Meal Plan 2 <= 0.50 | | | | | | |--- avg_price_per_room <= 72.50 | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | |--- avg_price_per_room > 72.50 | | | | | | | |--- weights: [0.00, 30.00] class: 1 | | | | | |--- type_of_meal_plan_Meal Plan 2 > 0.50 | | | | | | |--- yearly_MONTHS_2019-07 <= 0.50 | | | | | | | |--- Day of the Week <= 1.50 | | | | | | | | |--- weights: [1.00, 1.00] class: 0 | | | | | | | |--- Day of the Week > 1.50 | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | |--- yearly_MONTHS_2019-07 > 0.50 | | | | | | | |--- no_of_week_nights <= 1.50 | | | | | | | | |--- weights: [1.00, 1.00] class: 0 | | | | | | | |--- no_of_week_nights > 1.50 | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | |--- market_segment_type_Online > 0.50 | | | | |--- avg_price_per_room <= 1.50 | | | | | |--- lead_time <= 295.50 | | | | | | |--- yearly_MONTHS_2017-09 <= 0.50 | | | | | | | |--- yearly_MONTHS_2017-10 <= 0.50 | | | | | | | | |--- weights: [4.00, 0.00] class: 0 | | | | | | | |--- yearly_MONTHS_2017-10 > 0.50 | | | | | | | | |--- weights: [1.00, 1.00] class: 0 | | | | | | |--- yearly_MONTHS_2017-09 > 0.50 | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | |--- lead_time > 295.50 
| | | | | | |--- weights: [0.00, 6.00] class: 1 | | | | |--- avg_price_per_room > 1.50 | | | | | |--- yearly_MONTHS_2018-12 <= 0.50 | | | | | | |--- weights: [0.00, 497.00] class: 1 | | | | | |--- yearly_MONTHS_2018-12 > 0.50 | | | | | | |--- no_of_week_nights <= 1.50 | | | | | | | |--- avg_price_per_room <= 70.20 | | | | | | | | |--- type_of_meal_plan_Not Selected <= 0.50 | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | |--- type_of_meal_plan_Not Selected > 0.50 | | | | | | | | | |--- lead_time <= 237.00 | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | |--- lead_time > 237.00 | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | |--- avg_price_per_room > 70.20 | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | |--- no_of_week_nights > 1.50 | | | | | | | |--- no_of_weekend_nights <= 0.50 | | | | | | | | |--- Day of the Week <= 3.50 | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | |--- Day of the Week > 3.50 | | | | | | | | | |--- weights: [0.00, 5.00] class: 1 | | | | | | | |--- no_of_weekend_nights > 0.50 | | | | | | | | |--- weights: [0.00, 30.00] class: 1 | | |--- no_of_special_requests > 0.50 | | | |--- no_of_weekend_nights <= 0.50 | | | | |--- lead_time <= 180.50 | | | | | |--- yearly_MONTHS_2019-06 <= 0.50 | | | | | | |--- yearly_MONTHS_2019-02 <= 0.50 | | | | | | | |--- yearly_MONTHS_2018-09 <= 0.50 | | | | | | | | |--- yearly_MONTHS_2019-03 <= 0.50 | | | | | | | | | |--- yearly_MONTHS_2018-11 <= 0.50 | | | | | | | | | | |--- room_type_reserved_Room_Type 4 <= 0.50 | | | | | | | | | | | |--- truncated branch of depth 9 | | | | | | | | | | |--- room_type_reserved_Room_Type 4 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 3 | | | | | | | | | |--- yearly_MONTHS_2018-11 > 0.50 | | | | | | | | | | |--- no_of_special_requests <= 1.50 | | | | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | | | | |--- no_of_special_requests > 1.50 | | | | | | | 
| | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | | | |--- yearly_MONTHS_2019-03 > 0.50 | | | | | | | | | |--- Day of the Week <= 3.50 | | | | | | | | | | |--- no_of_week_nights <= 3.50 | | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | | |--- no_of_week_nights > 3.50 | | | | | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | | | | |--- Day of the Week > 3.50 | | | | | | | | | | |--- lead_time <= 171.00 | | | | | | | | | | | |--- weights: [5.00, 0.00] class: 0 | | | | | | | | | | |--- lead_time > 171.00 | | | | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | | |--- yearly_MONTHS_2018-09 > 0.50 | | | | | | | | |--- weights: [0.00, 1.00] class: 1 | | | | | | |--- yearly_MONTHS_2019-02 > 0.50 | | | | | | | |--- lead_time <= 172.50 | | | | | | | | |--- weights: [0.00, 2.00] class: 1 | | | | | | | |--- lead_time > 172.50 | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | |--- yearly_MONTHS_2019-06 > 0.50 | | | | | | |--- no_of_week_nights <= 4.00 | | | | | | | |--- weights: [0.00, 3.00] class: 1 | | | | | | |--- no_of_week_nights > 4.00 | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | |--- lead_time > 180.50 | | | | | |--- no_of_special_requests <= 2.50 | | | | | | |--- market_segment_type_Online <= 0.50 | | | | | | | |--- avg_price_per_room <= 89.55 | | | | | | | | |--- weights: [12.00, 0.00] class: 0 | | | | | | | |--- avg_price_per_room > 89.55 | | | | | | | | |--- no_of_special_requests <= 1.50 | | | | | | | | | |--- no_of_week_nights <= 1.50 | | | | | | | | | | |--- weights: [1.00, 0.00] class: 0 | | | | | | | | | |--- no_of_week_nights > 1.50 | | | | | | | | | | |--- yearly_MONTHS_2018-10 <= 0.50 | | | | | | | | | | | |--- weights: [0.00, 4.00] class: 1 | | | | | | | | | | |--- yearly_MONTHS_2018-10 > 0.50 | | | | | | | | | | | |--- truncated branch of depth 2 | | | | | | | | |--- no_of_special_requests > 1.50 | | | | | | | | | |--- weights: [2.00, 0.00] class: 0 | | | | | | |--- 
[... output truncated: the remainder of the full-depth (unpruned) tree's text rules runs to hundreds of branches, splitting mainly on lead_time, avg_price_per_room, no_of_special_requests, no_of_week_nights, market_segment_type_Online, and the monthly dummies. Many leaves hold only a handful of bookings, consistent with the unpruned tree overfitting the training data.]
importances = model.feature_importances_
indices = np.argsort(importances)
plt.figure(figsize=(12, 12))
plt.title("Feature Importances")
plt.barh(range(len(indices)), importances[indices], color="violet", align="center")
plt.yticks(range(len(indices)), [feature_names[i] for i in indices])
plt.xlabel("Relative Importance")
plt.show()
# Choose the type of classifier.
estimator = DecisionTreeClassifier(random_state=1)  # class_weight={0: 0.34, 1: 0.66} left commented out
# Grid of parameters to choose from
parameters = {
    "max_depth": [5, 10, 15, None],
    "criterion": ["entropy", "gini"],
    "splitter": ["best", "random"],
    "min_impurity_decrease": [0.00001, 0.0001, 0.01],
}
# Type of scoring used to compare parameter combinations
scorer = make_scorer(recall_score)
# Run the grid search
grid_obj = GridSearchCV(estimator, parameters, scoring=scorer, cv=5)
grid_obj = grid_obj.fit(X_train, y_train)
# Set the clf to the best combination of parameters
estimator = grid_obj.best_estimator_
# Fit the best algorithm to the data.
estimator.fit(X_train, y_train)
DecisionTreeClassifier(criterion='entropy', max_depth=5,
min_impurity_decrease=0.01, random_state=1)
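The winning combination can also be read directly off the fitted grid object. A minimal, self-contained sketch (synthetic data and a smaller illustrative parameter grid, not the notebook's actual dataset):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import make_scorer, recall_score
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for X_train / y_train
X, y = make_classification(n_samples=500, n_features=10, random_state=1)

grid = GridSearchCV(
    DecisionTreeClassifier(random_state=1),
    param_grid={"max_depth": [3, 5, None], "criterion": ["gini", "entropy"]},
    scoring=make_scorer(recall_score),  # optimize recall, as in the notebook
    cv=5,
).fit(X, y)

print(grid.best_params_)  # the winning hyperparameter combination
print(grid.best_score_)   # its mean cross-validated recall
```

`best_estimator_` is the copy refit on the full training data, which is what the notebook assigns back to `estimator`.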
confusion_matrix_sklearn(estimator, X_train, y_train)
decision_tree_tune_perf_train = get_recall_score(estimator, X_train, y_train)
print("Recall Score:", decision_tree_tune_perf_train)
Recall Score: 0.7362662674190947
confusion_matrix_sklearn(estimator, X_test, y_test)
decision_tree_tune_perf_test = get_recall_score(estimator, X_test, y_test)
print("Recall Score:", decision_tree_tune_perf_test)
Recall Score: 0.7419354838709677
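Recall is the metric optimized throughout because a missed cancellation (a false negative) is the costly outcome for the hotel. A quick hand-made sanity check of the definition recall = TP / (TP + FN):

```python
from sklearn.metrics import recall_score

# Hand-made labels: 4 actual cancellations (class 1), of which the model catches 3
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 0, 1]

# recall = TP / (TP + FN) = 3 / (3 + 1) = 0.75
print(recall_score(y_true, y_pred))  # 0.75
```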
plt.figure(figsize=(15, 10))
out = tree.plot_tree(
estimator,
feature_names=feature_names,
filled=True,
fontsize=9,
node_ids=False,
class_names=None,
)
for o in out:
    arrow = o.arrow_patch
    if arrow is not None:
        arrow.set_edgecolor("black")
        arrow.set_linewidth(1)
plt.show()
# Text report showing the rules of a decision tree -
print(tree.export_text(estimator, feature_names=feature_names, show_weights=True))
|--- lead_time <= 150.50
|   |--- no_of_special_requests <= 0.50
|   |   |--- market_segment_type_Online <= 0.50
|   |   |   |--- weights: [2820.00, 287.00] class: 0
|   |   |--- market_segment_type_Online > 0.50
|   |   |   |--- lead_time <= 8.50
|   |   |   |   |--- weights: [959.00, 218.00] class: 0
|   |   |   |--- lead_time > 8.50
|   |   |   |   |--- weights: [2139.00, 3103.00] class: 1
|   |--- no_of_special_requests > 0.50
|   |   |--- lead_time <= 7.50
|   |   |   |--- weights: [2072.00, 66.00] class: 0
|   |   |--- lead_time > 7.50
|   |   |   |--- no_of_special_requests <= 1.50
|   |   |   |   |--- weights: [4984.00, 1491.00] class: 0
|   |   |   |--- no_of_special_requests > 1.50
|   |   |   |   |--- weights: [2815.00, 228.00] class: 0
|--- lead_time > 150.50
|   |--- avg_price_per_room <= 100.06
|   |   |--- weights: [876.00, 1018.00] class: 1
|   |--- avg_price_per_room > 100.06
|   |   |--- no_of_special_requests <= 2.50
|   |   |   |--- weights: [45.00, 2272.00] class: 1
|   |   |--- no_of_special_requests > 2.50
|   |   |   |--- weights: [131.00, 0.00] class: 0
# Importance of features in tree building: the importance of a feature is computed as the
# (normalized) total reduction of the criterion brought by that feature (also known as Gini importance)
print(
pd.DataFrame(
estimator.feature_importances_, columns=["Imp"], index=X_train.columns
).sort_values(by="Imp", ascending=False)
)
# Note that the importances of the few features actually used in splits have increased
                                     Imp
lead_time                       0.479674
no_of_special_requests          0.253208
market_segment_type_Online      0.177435
avg_price_per_room              0.089684
(all 39 remaining features)     0.000000
importances = estimator.feature_importances_
indices = np.argsort(importances)
plt.figure(figsize=(12, 12))
plt.title("Feature Importances")
plt.barh(range(len(indices)), importances[indices], color="violet", align="center")
plt.yticks(range(len(indices)), [feature_names[i] for i in indices])
plt.xlabel("Relative Importance")
plt.show()
clf = DecisionTreeClassifier(random_state=1)  # class_weight={0: 0.34, 1: 0.66} left commented out
path = clf.cost_complexity_pruning_path(X_train, y_train)
ccp_alphas, impurities = path.ccp_alphas, path.impurities
pd.DataFrame(path)
| | ccp_alphas | impurities |
|---|---|---|
| 0 | 0.000000 | 0.004231 |
| 1 | 0.000000 | 0.004231 |
| 2 | 0.000000 | 0.004231 |
| 3 | 0.000003 | 0.004234 |
| 4 | 0.000007 | 0.004240 |
| ... | ... | ... |
| 1836 | 0.009342 | 0.301661 |
| 1837 | 0.012485 | 0.314146 |
| 1838 | 0.012767 | 0.326913 |
| 1839 | 0.025269 | 0.377452 |
| 1840 | 0.071469 | 0.448921 |
1841 rows × 2 columns
fig, ax = plt.subplots(figsize=(10, 5))
ax.plot(ccp_alphas[:-1], impurities[:-1], marker="o", drawstyle="steps-post")
ax.set_xlabel("effective alpha")
ax.set_ylabel("total impurity of leaves")
ax.set_title("Total Impurity vs effective alpha for training set")
plt.show()
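The pruning-path API can be exercised end to end on a small synthetic problem. The sketch below (toy data, not the hotel dataset) checks two properties relied on in this section: the returned alphas are non-decreasing, and refitting with the largest alpha collapses the tree to a single root node.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=1)

# cost_complexity_pruning_path fits the full tree internally and returns
# the weakest-link alphas at which successive subtrees get pruned away
path = DecisionTreeClassifier(random_state=1).cost_complexity_pruning_path(X, y)
ccp_alphas = path.ccp_alphas
assert np.all(np.diff(ccp_alphas) >= 0)  # alphas come back sorted

# Fitting with the largest alpha prunes everything but the root
stump = DecisionTreeClassifier(random_state=1, ccp_alpha=ccp_alphas[-1]).fit(X, y)
print(stump.tree_.node_count)  # 1
```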
Next, we train a decision tree using the effective alphas. The last value in ccp_alphas is the alpha value that prunes the whole tree, leaving the tree, clfs[-1], with one node.
clfs = []
for ccp_alpha in ccp_alphas:
    clf = DecisionTreeClassifier(
        random_state=1, ccp_alpha=ccp_alpha  # class_weight={0: 0.34, 1: 0.66}
    )
    clf.fit(X_train, y_train)
    clfs.append(clf)
print(
"Number of nodes in the last tree is: {} with ccp_alpha: {}".format(
clfs[-1].tree_.node_count, ccp_alphas[-1]
)
)
Number of nodes in the last tree is: 1 with ccp_alpha: 0.07146926305599055
For the remainder, we remove the last element in clfs and ccp_alphas, because it is the trivial tree with only one node. Here we show that the number of nodes and tree depth decreases as alpha increases.
clfs = clfs[:-1]
ccp_alphas = ccp_alphas[:-1]
node_counts = [clf.tree_.node_count for clf in clfs]
depth = [clf.tree_.max_depth for clf in clfs]
fig, ax = plt.subplots(2, 1, figsize=(10, 7))
ax[0].plot(ccp_alphas, node_counts, marker="o", drawstyle="steps-post")
ax[0].set_xlabel("alpha")
ax[0].set_ylabel("number of nodes")
ax[0].set_title("Number of nodes vs alpha")
ax[1].plot(ccp_alphas, depth, marker="o", drawstyle="steps-post")
ax[1].set_xlabel("alpha")
ax[1].set_ylabel("depth of tree")
ax[1].set_title("Depth vs alpha")
fig.tight_layout()
recall_train = []
for clf in clfs:
    pred_train = clf.predict(X_train)
    values_train = recall_score(y_train, pred_train)
    recall_train.append(values_train)
recall_test = []
for clf in clfs:
    pred_test = clf.predict(X_test)
    values_test = recall_score(y_test, pred_test)
    recall_test.append(values_test)
train_scores = [clf.score(X_train, y_train) for clf in clfs]
test_scores = [clf.score(X_test, y_test) for clf in clfs]
fig, ax = plt.subplots(figsize=(15, 5))
ax.set_xlabel("alpha")
ax.set_ylabel("Recall")
ax.set_title("Recall vs alpha for training and testing sets")
ax.plot(
ccp_alphas, recall_train, marker="o", label="train", drawstyle="steps-post",
)
ax.plot(ccp_alphas, recall_test, marker="o", label="test", drawstyle="steps-post")
ax.legend()
plt.show()
The maximum recall is reached around alpha = 0.025, but at that alpha the decision tree is pruned down to little more than a root node and we would lose the business rules. Instead, we can choose alpha = 0.004, which retains the tree structure (and its interpretable rules) while still giving a high recall.
# selecting the model with the highest test recall
index_best_model = np.argmax(recall_test)
best_model = clfs[index_best_model]
print(best_model)
DecisionTreeClassifier(ccp_alpha=0.012484589094136037, random_state=1)
best_model.fit(X_train, y_train)
DecisionTreeClassifier(ccp_alpha=0.012484589094136037, random_state=1)
confusion_matrix_sklearn(best_model, X_train, y_train)
print("Recall Score:", get_recall_score(best_model, X_train, y_train))
Recall Score: 0.7613727974202464
confusion_matrix_sklearn(best_model, X_test, y_test)
print("Recall Score:", get_recall_score(best_model, X_test, y_test))
Recall Score: 0.7672934276349836
plt.figure(figsize=(5, 5))
out = tree.plot_tree(
best_model,
feature_names=feature_names,
filled=True,
fontsize=9,
node_ids=False,
class_names=None,
)
for o in out:
    arrow = o.arrow_patch
    if arrow is not None:
        arrow.set_edgecolor("black")
        arrow.set_linewidth(1)
plt.show()
best_model2 = DecisionTreeClassifier(
    ccp_alpha=0.004, random_state=1  # class_weight={0: 0.15, 1: 0.85} left commented out
)
best_model2.fit(X_train, y_train)
DecisionTreeClassifier(ccp_alpha=0.004, random_state=1)
confusion_matrix_sklearn(best_model2, X_train, y_train)
decision_tree_postpruned_perf_train = get_recall_score(best_model2, X_train, y_train)
print("Recall Score:", decision_tree_postpruned_perf_train)
Recall Score: 0.6778763100310953
confusion_matrix_sklearn(best_model2, X_test, y_test)
decision_tree_postpruned_perf_test = get_recall_score(best_model2, X_test, y_test)
print("Recall Score:", decision_tree_postpruned_perf_test)
Recall Score: 0.6860445057788511
plt.figure(figsize=(15, 10))
out = tree.plot_tree(
best_model2,
feature_names=feature_names,
filled=True,
fontsize=9,
node_ids=False,
class_names=None,
)
for o in out:
    arrow = o.arrow_patch
    if arrow is not None:
        arrow.set_edgecolor("black")
        arrow.set_linewidth(1)
plt.show()
# Text report showing the rules of a decision tree -
print(tree.export_text(best_model2, feature_names=feature_names, show_weights=True))
|--- lead_time <= 150.50
|   |--- no_of_special_requests <= 0.50
|   |   |--- market_segment_type_Online <= 0.50
|   |   |   |--- weights: [2820.00, 287.00] class: 0
|   |   |--- market_segment_type_Online > 0.50
|   |   |   |--- lead_time <= 9.50
|   |   |   |   |--- weights: [1011.00, 251.00] class: 0
|   |   |   |--- lead_time > 9.50
|   |   |   |   |--- weights: [2087.00, 3070.00] class: 1
|   |--- no_of_special_requests > 0.50
|   |   |--- weights: [9871.00, 1785.00] class: 0
|--- lead_time > 150.50
|   |--- avg_price_per_room <= 100.06
|   |   |--- no_of_special_requests <= 0.50
|   |   |   |--- market_segment_type_Online <= 0.50
|   |   |   |   |--- weights: [233.00, 108.00] class: 0
|   |   |   |--- market_segment_type_Online > 0.50
|   |   |   |   |--- weights: [8.00, 544.00] class: 1
|   |   |--- no_of_special_requests > 0.50
|   |   |   |--- weights: [635.00, 366.00] class: 0
|   |--- avg_price_per_room > 100.06
|   |   |--- no_of_special_requests <= 2.50
|   |   |   |--- weights: [45.00, 2272.00] class: 1
|   |   |--- no_of_special_requests > 2.50
|   |   |   |--- weights: [131.00, 0.00] class: 0
# Importance of features in tree building: the importance of a feature is computed as the
# (normalized) total reduction of the criterion brought by that feature (also known as Gini importance)
print(
pd.DataFrame(
best_model2.feature_importances_, columns=["Imp"], index=X_train.columns
).sort_values(by="Imp", ascending=False)
)
                                     Imp
lead_time                       0.497056
market_segment_type_Online      0.219175
no_of_special_requests          0.208182
avg_price_per_room              0.075588
(all 39 remaining features)     0.000000
importances = best_model2.feature_importances_
indices = np.argsort(importances)
plt.figure(figsize=(12, 12))
plt.title("Feature Importances")
plt.barh(range(len(indices)), importances[indices], color="violet", align="center")
plt.yticks(range(len(indices)), [feature_names[i] for i in indices])
plt.xlabel("Relative Importance")
plt.show()
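Gini importances are normalized to sum to 1, which is why only the four features the pruned tree actually splits on carry nonzero weight while everything else is exactly 0. A toy check on synthetic data (illustrative only):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=6, random_state=1)
imp = DecisionTreeClassifier(random_state=1).fit(X, y).feature_importances_

print(np.isclose(imp.sum(), 1.0))  # True: importances are normalized
```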
# training performance comparison
models_train_comp_df = pd.DataFrame(
[
decision_tree_perf_train,
decision_tree_tune_perf_train,
decision_tree_postpruned_perf_train,
],
columns=["Recall on training set"],
)
print("Training performance comparison:")
models_train_comp_df
Training performance comparison:
| | Recall on training set |
|---|---|
| 0 | 0.988023 |
| 1 | 0.736266 |
| 2 | 0.677876 |
# testing performance comparison
models_test_comp_df = pd.DataFrame(
[
decision_tree_perf_test,
decision_tree_tune_perf_test,
decision_tree_postpruned_perf_test,
],
columns=["Recall on testing set"],
)
print("Test performance comparison:")
models_test_comp_df
Test performance comparison:
| | Recall on testing set |
|---|---|
| 0 | 0.700880 |
| 1 | 0.741935 |
| 2 | 0.686045 |
According to the decision tree model -
a) If a customer books through the online market segment with a lead time of roughly 9 to 150 days and makes no special requests, there is a high chance that the booking will be cancelled.
b) If the room price is above 100 and the customer makes more than two special requests, the booking is very unlikely to be cancelled.
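These headline rules can be transcribed into a plain predicate for discussion purposes. The function below is purely illustrative (the name and the simplified rule set are ours; the thresholds come from the printed tree), not a substitute for scoring with the fitted model:

```python
def likely_to_cancel(lead_time, n_special_requests, avg_price, is_online):
    """Illustrative transcription of the pruned tree's headline rules."""
    # Rule (a): online bookings ~9-150 days ahead with no special requests
    if is_online and 8.5 < lead_time <= 150.5 and n_special_requests <= 0.5:
        return True
    # Rule (b): pricey rooms booked with 3+ special requests almost never cancel
    if avg_price > 100.06 and n_special_requests > 2.5:
        return False
    # Long lead time + pricey room (and, given the rule above, few requests) tends to cancel
    if lead_time > 150.5 and avg_price > 100.06:
        return True
    return False

print(likely_to_cancel(lead_time=60, n_special_requests=0, avg_price=90, is_online=True))   # True
print(likely_to_cancel(lead_time=30, n_special_requests=3, avg_price=120, is_online=False)) # False
```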
Potential customers - Employ the predictive model to identify customers who are likely to honour their bookings, and offer limited-time coupons/discounts in real time only to those customers. This can also be applied in months such as April, May, July, and August: traffic is higher in those months, so they carry more potential confirming customers.
Fewer cancellations are observed on Sundays, Tuesdays, and Wednesdays; the hotel should run schemes/offers with minimum lead time on these days to attract more customers.
December and January were the months with the lowest booking cancellations. With further data, the hotel should investigate which packages were running in those months, draw inspiration from them, and implement more such offerings.
Customer retention - member loyalty programs with initiatives such as special discounts, coupons, etc. can be provided.
Better resource management - Tuesday, Wednesday, and Sunday are when the hotel sees the most traffic; resources such as customer-care services can be allocated more heavily on these days.
The hotel should create more complimentary packages for repeat guests who visit frequently, for example a complimentary spa session.